How’s Your A/B Testing Going… Post-Mobilegeddon?

by Art

Last updated on January 9th, 2018

Before I get into this post, I want to reinforce something that’s common knowledge and expand on why it’s important in the context of everything I’m about to write.

People act differently on mobile compared to how they act on a laptop or desktop computer.

I don’t think there’s anyone out there who would react to a website exactly the same way on an iPhone as they would if they were accessing it on a MacBook Pro.

In fact, apps like Pocket, Instapaper and Reading List are built on the idea that vast numbers of people prefer to:

  1. Send articles etc. to mobiles/tablets to read later
  2. Delay signing up for a new service/product until they’re on a computer

Of course, not every article is read on a mobile and not every signup takes place on a computer, but I'd argue that the above are true in the majority of cases.

Ignoring device type causes problems

Once a split test has been set up, most marketers will spend a lot of time (perhaps more than they’d like to admit) looking at a screen that looks something like this:

[Image: VWO split test results view]

Once a statistically significant change has been registered, they'll change the website accordingly. Not so fast! Without digging into the results in more detail, we can't be sure they're legitimate. Why not? Because…

Split tests don’t always work on mobile

Time for a story. Several years ago, I was working with a company on a split test. We set everything up and checked that it looked OK in Optimizely’s preview.

But, when we got our smartphones out to test it, we found that—despite what the preview had shown us—the send button was out of view in a mobile browser.

Chances are that some people figured out that hitting return on their mobile keyboard would send the data, but we certainly can't assume everyone realized that.

When working on another campaign with the same company, we set up a test that added a second signup form further up the page. On the face of it, the result was very little change. Except that wasn't the real story, because…

Results aren’t consistent across platforms

When we looked more closely at the results we saw that, while there wasn’t much change on computers, the number of signups on mobile had increased by around 20%.

Because mobile traffic in the test was outnumbered by desktop/laptop traffic around 5 to 1, the combined figures made it look as if nothing had changed.

A negative result on one platform combined with a positive result on another can cancel out, making even dramatic effects look inconsequential when the opposite is the case.
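To see how easily this happens, here's a quick back-of-the-envelope sketch in Python. The numbers are invented for illustration (they're not the real figures from our test), but they mirror the setup above: desktop outnumbers mobile roughly 5 to 1, desktop dips slightly, and mobile jumps 20%.

```python
# Invented illustrative numbers -- not the actual figures from the test above.
# Desktop outnumbers mobile ~5:1; desktop dips slightly, mobile lifts 20%.
segments = {
    # segment: (control_visitors, control_signups, variant_visitors, variant_signups)
    "desktop": (5000, 500, 5000, 480),  # 10.0% -> 9.6%: a -4% relative dip
    "mobile":  (1000,  50, 1000,  60),  #  5.0% -> 6.0%: a +20% relative lift
}

totals = [0, 0, 0, 0]
for name, counts in segments.items():
    cv, cs, vv, vs = counts
    lift = (vs / vv - cs / cv) / (cs / cv)
    print(f"{name:8s} control {cs / cv:.1%}  variant {vs / vv:.1%}  lift {lift:+.1%}")
    totals = [t + c for t, c in zip(totals, counts)]

cv, cs, vv, vs = totals
lift = (vs / vv - cs / cv) / (cs / cv)
print(f"{'pooled':8s} control {cs / cv:.1%}  variant {vs / vv:.1%}  lift {lift:+.1%}")
```

Run it and the pooled row shows a lift of about -1.8%: a test you'd probably bin as a failure, even though it contains a 20% mobile win you could act on.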

Some Mobilegeddon-specific tips

[Image: Mobilegeddon/Armageddon meme]

I mentioned Mobilegeddon, a common nickname for the Google algorithm update released in April 2015, in my post title, so I think it's high time I started talking about it.

In the past, it's been very common for marketers to mash landing pages together without paying the attention to design and responsiveness that a permanent page would warrant.

The thinking was that a quick and dirty test could later be refined and polished if it garnered a positive result. Post-Mobilegeddon, this isn’t such a good idea.

You can see from the link above that Google now officially punishes pages that have the following:

  • Text that is unreadable without tapping or zooming
  • Tap targets with inappropriate spacing
  • Unplayable content or horizontal scrolling

If you use landing pages to test hypotheses, be aware that a lack of mobile friendliness could hurt a ‘temporary’ landing page’s chances of ranking if it’s later turned into a permanent addition to your site.

There is no fold (possibly)

Already separating mobile traffic from desktop/laptop traffic in your split tests? Sorry, that may not be enough.

[Image: ‘We need to go deeper’ meme]

In June 2015, the Unbounce blog posted a controversial piece called There Is No Fold. The gist of the article is that, because there is now such a huge array of screen sizes on the market, it’s impossible to optimize content around ‘the fold’.

If we take that to be true, which I’m inclined to do, it means that testing with the parameters ‘mobile vs. computer’ is no longer good enough.

Marketers have written elsewhere about the fact that spending on smartphones differs by type. For example, 78% of mobile purchases take place on iPhones.

That alone is reason enough to look at segmenting test traffic by screen size, device type and/or OS, if the software you use allows you to do so.
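If your testing tool only gives you a raw export, you can still segment by hand. Below is a minimal Python sketch, assuming a hypothetical CSV export with one row per visitor; the column names (variant, device, os, converted) are my invention, so adapt them to whatever your tool actually produces. It buckets visitors by device type and OS, then runs a standard two-proportion z-test on each bucket.

```python
import csv
from collections import defaultdict
from math import sqrt

# Hypothetical export: one row per visitor, with columns
# variant ("control"/"variant"), device, os, converted ("0"/"1").
buckets = defaultdict(lambda: [0, 0, 0, 0])  # (device, os) -> [c_n, c_x, v_n, v_x]

with open("ab_test_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        i = 0 if row["variant"] == "control" else 2
        b = buckets[(row["device"], row["os"])]
        b[i] += 1                           # visitors in this arm
        b[i + 1] += int(row["converted"])   # conversions in this arm

for seg, (c_n, c_x, v_n, v_x) in sorted(buckets.items()):
    if min(c_n, v_n) == 0:
        continue                            # no traffic in one arm; skip
    p1, p2 = c_x / c_n, v_x / v_n
    p = (c_x + v_x) / (c_n + v_n)           # pooled rate under the null hypothesis
    if p in (0.0, 1.0):
        continue                            # zero variance; z-test undefined
    z = (p2 - p1) / sqrt(p * (1 - p) * (1 / c_n + 1 / v_n))
    print(f"{seg}: control {p1:.1%}, variant {p2:.1%}, z = {z:+.2f}")
```

As a rule of thumb, |z| ≥ 1.96 corresponds to significance at the 95% level. One caution: the more segments you slice, the more likely one of them shows a ‘significant’ result by chance, so treat per-segment wins as hypotheses to re-test rather than final verdicts.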

Inconsistent results have a big impact on ROI

Splitting tests by desktop and mobile/tablet traffic or using their differences to test ideas can do much more than just rejuvenate a stagnant approach to A/B testing, although it can certainly do that as well.

Based on the points above, looking at the difference between desktop and mobile split testing can make your marketing much more efficient by teaching you what turns visitors into signups for different audiences, devices and so on.

Segmenting traffic might take some getting used to, especially if your software package of choice doesn’t make it easy (for the record, most of the major ones do), but it’s worth it.

Final Thoughts

A/B testing is now also available for mobile apps, using tools like Optimizely or Apptimize, without needing to resubmit to the App Store or Play Store or wait for approval.

I don’t have room to get into that here, but there’s a great case study on the Optimizely blog about how Secret Escapes used split testing to double signups on mobile and improve LTV.

The key takeaways here are the following:

  • Evidence is mounting that m-commerce is becoming more and more important, but…
  • Different behavior on different devices is important, and should be at the core of most hypotheses.
  • Those different behaviors can have a huge impact on the results of split tests.
  • Ignoring mobile best practices could be damaging if you hope to rank landing pages further down the line.

Do you already segment your CRO activity into desktop and mobile split testing, or use screen size/OS as a factor in splitting traffic? Let us know what happened!
