In January, I shared a post on Crazy Egg’s new priority as a company: our mission is to help you get better at paying attention to the people visiting your website, so you can improve their experience.
We believe (and have seen firsthand) that focusing on paying attention has an undeniable downstream impact.
Paying Attention to What People are Doing
Since then, we’ve spent a lot of time paying attention to the people using Crazy Egg: where they spend their time, what they avoid, and, perhaps most interestingly of all, where they start to spend time and ultimately get stuck.
These insights have helped us transform our onboarding by asking people setting up Snapshots about their goals and priorities.
It’s helped us work on a partnership with SurveyMonkey to add surveys to Crazy Egg.
It’s also helped us redesign our Dashboard, placing User Recordings in the context of specific Snapshots so people can dive deep into visitor journeys right from where they were already looking.
You’ll learn more about this and some other things we’ve got in store in our next set of Release Notes.
People are Getting Stuck on A/B Testing
One of the most interesting insights we’ve discovered over the past few weeks is the high volume of visits we have to our A/B Testing tool and the rapid drop off in people ultimately creating a test.
Crazy Egg Note: One of our big differentiators is our testing tool: we don’t just give you data, we also put you in a position to act on what you’ve learned so you can test new ideas instantly.
The insights we’ve gleaned from conversations with 1350+ customers have been enlightening. It’s not that people don’t want to test. It’s not that people don’t have ideas about what to test. It’s that people are either too anxious to test or simply don’t have enough time.
Or so they think.
Wayne Gretzky Would Have Made a Great Marketer
You miss 100% of the shots you don’t take. You fail at 100% of the hypotheses you never test and the tests you never run.
In the spirit of paying attention to this aspect of our experience where people appear to be getting stuck, we reached out to Deborah O’Malley, Founder of GuessTheTest, to get her ideas on what could be happening with our customers.
Deborah is also the Co-Host of the A/B Testing Summit, a virtual event running from June 11th – June 13th. Registration is free, and it features 30 speakers who are experts at helping you optimize your digital experiences and grow your revenue.
Crazy Egg is honored to be speaking on the very subject of paying attention at the Summit; we are not a financial sponsor — just fans of the event in general.
In this interview, Deborah shares her advice for how people can get unstuck and start testing quickly.
Deborah, tell us a little about your background and GuessTheTest. What makes you so passionate about the power of testing?
I unknowingly ran my first “A/B test” at the tender age of eight, for a grade school science experiment.
While most people were looking at things like how much water soil absorbed, I was interested in learning what catches people’s attention. I cut out different shapes and sizes of colored construction paper, posted them on a board, and asked people which they saw first. The results were a mishmash of different answers and didn’t yield much insight. But they provided an early glimpse into the value of conducting solid, sound experiments.
Fast forward, via a circuitous path, many years later: I ended up doing a Master of Science degree specializing in eye-tracking technology, where I was able to quantify what people saw and responded to in digital ads.
Everything came full circle, and it fueled my passion for merging quantitative data with psychology to inform and understand human behavior.
My master’s launched me first into the field of User Experience (UX), and soon after led me to analyzing, producing, and reporting on A/B test case studies for clients.
With this experience, I founded GuessTheTest to share interesting case study outcomes and findings with digital marketers. On GuessTheTest, you can take your best guess on the Featured Test, see if you win, and apply the findings to inspire your own digital marketing success.
Bonus: Want to see how I approach A/B testing and website analysis live? Check out this webinar I participated in with Crazy Egg!
When I tell you that 80% of the people who visit our A/B Testing Product walk away without creating a test, what’s your reaction?
It’s disappointing to hear, but not super surprising.
As described in Crazy Egg’s talk for the A/B Testing Summit, most people use SaaS products just two to three times a month!
So, why do we purchase software solutions and not use them?
Time and fear.
Testing, especially, takes time. We need time to set up an A/B test, learn the platform, tease out a sound hypothesis, and monitor results.
Testing can also be scary. We might worry we’re not setting things up properly, or getting results we can trust.
We’ve been indoctrinated with the idea that we must do things right — or not do them at all. So, we choose the latter.
The great thing is, a test is just that. A test.
If it doesn’t win, it doesn’t need to be implemented. And, the learnings you’ll get from the results — whether positive or negative — will help inform your direction so you can confidently move forward and optimize future gains.
Yes, it takes time to learn a new platform. Yes, it’s scary to dig in. But, once you have, the reward far outweighs the risk. It’s an investment you’ll be glad you made.
What advice do you have for the 80% of people who obviously have an interest, motivation, or desire to run a test, but hesitate to do so?
Outstrip the majority.
Take 20% of your time and make it worth 80% of your success. Set aside two hours once a month and just dig in. Or, as Crazy Egg recommends, set aside 15 minutes once a week to get unstuck, build momentum, and get started.
With a platform like Crazy Egg, you can do a lot — and learn how to do it quickly. Once you’re up and running, setting up an A/B test won’t seem time consuming, or scary. It will become second nature. And you’ll be able to learn SO MUCH from your findings.
So, just dive in and do it. You’ll be glad you did!
We often hear that people are anxious about running a test successfully. How do you help people think through the anxiety of properly structuring a test?
Setting up and running an A/B test for the first time can feel super scary. You might feel you don’t have enough knowledge to competently be testing. You might be afraid you haven’t set up the test properly. Or, you might worry you won’t have enough traffic to get reliable results.
These are all valid fears. But, they shouldn’t stop you from testing.
To get up to speed on testing know-how, there are plenty of resources out there that can help. Crazy Egg’s site and help center has fantastic articles. GuessTheTest offers inspiring ideas, and there are many great articles, like this one and this one, devoted to helping beginners get started with optimization and A/B testing.
If your test isn’t set up properly, it won’t run properly. You’ll find that out pretty quickly. But if you’re in a jam, you can always reach out to customer support to get help. So, that fear can pretty much be erased.
Not having enough traffic is a true testing barrier. To get valid, reliable results, you need a large enough sample size. The problem is, it’s really hard to define what large enough is with an exact number. It changes based on several factors like significance level and power.
The good news is you can use a sample size calculator like this one to determine the sample size you need to run a valid test.
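To demystify what a sample size calculator does under the hood, here’s a minimal sketch using Python’s standard library. It applies the common two-proportion approximation; the 5% baseline and 6% target rates are hypothetical numbers chosen purely for illustration.

```python
import math
from statistics import NormalDist

def sample_size_per_variant(p_baseline, p_expected, alpha=0.05, power=0.80):
    """Approximate visitors needed per variant to detect a lift from
    p_baseline to p_expected at the given significance level and power."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance
    z_beta = NormalDist().inv_cdf(power)           # desired statistical power
    variance = p_baseline * (1 - p_baseline) + p_expected * (1 - p_expected)
    effect = p_expected - p_baseline
    return math.ceil((z_alpha + z_beta) ** 2 * variance / effect ** 2)

# Hypothetical example: detecting a lift from a 5% to a 6% conversion rate
n = sample_size_per_variant(0.05, 0.06)
```

Notice how the required sample shrinks as the expected effect grows: small lifts on low-traffic sites demand far more visitors than most people expect, which is exactly why the calculator matters.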
Even if you don’t have enough traffic — and probably won’t for a very long time — don’t let that discourage you from testing now. You can still run a test with limited traffic. Just be aware that if you implement the “winning” result, it may not hold true for all your users over time. So it’s especially important to monitor results and ensure any change you’ve made has a long-standing, positive impact.
Crazy Egg note: The great news about our own multi-armed bandit approach to A/B testing is that it takes the hard work of figuring out a sample size off your plate. We adjust the traffic to your variants based on conversion rates, so you never miss out.
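Crazy Egg hasn’t published the internals of its bandit, but to give a feel for the general idea, here’s a minimal sketch of Thompson sampling, one common multi-armed bandit technique (assumed here purely for illustration, with made-up 5% and 10% conversion rates):

```python
import random

class ThompsonBandit:
    """Beta-Bernoulli Thompson sampling: each variant's conversion rate is
    modeled as a Beta distribution updated from observed outcomes."""

    def __init__(self, n_variants):
        self.successes = [0] * n_variants
        self.failures = [0] * n_variants

    def choose(self):
        # Sample a plausible conversion rate per variant; pick the best draw.
        draws = [random.betavariate(s + 1, f + 1)
                 for s, f in zip(self.successes, self.failures)]
        return draws.index(max(draws))

    def record(self, variant, converted):
        if converted:
            self.successes[variant] += 1
        else:
            self.failures[variant] += 1

# Simulate two variants with hypothetical true conversion rates of 5% and 10%.
random.seed(42)
bandit = ThompsonBandit(2)
true_rates = [0.05, 0.10]
traffic = [0, 0]
for _ in range(5000):
    v = bandit.choose()
    traffic[v] += 1
    bandit.record(v, random.random() < true_rates[v])
# Over time the better-converting variant receives most of the traffic.
```

The key point is that traffic allocation adapts as evidence accumulates, so weaker variants stop costing you conversions long before a fixed-split test would have finished.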
When you look at it this way, there’s really no excuse for not testing.
Is there a favorite case study you use to showcase the power and impact of getting people started on A/B Testing?
One of my favorite case studies is a test I ran for a small, local law firm client.
The client knew the value of taking a data-driven approach and already had heatmapping data set up, but didn’t have the time or knowledge to optimize his site himself. He hired me to help.
Looking at the heatmaps combined with Google Analytics data, I saw a big problem right away: people were bouncing heavily and not engaging beyond the hero image.
Despite all the fantastic information available on the site, no one was reading it. And, even fewer were booking consultations, converting into potential clients.
Adding an exit intent pop-up that captured bouncing users’ contact info seemed like a no-brainer.
But, we didn’t want to do so blindly so we decided to test.
Our first ever test looked at the effect of adding a pop-up that captured prospects’ contact information upon exit. Preliminary results showed the tactic seemed to be working.
But, there was consensus it could be optimized even more.
Going back to the data, it looked like most people didn’t want to fill in their email contact info; they wanted to speak with someone on the phone right away.
So, we tested the effect of adding a phone number people could call in addition to the email capture.
Bingo! We struck gold.
Compared to the original landing page with no pop-up at all, adding an exit intent pop-up with an email capture and contact phone number increased conversions by 80.35%.
This number resulted in several new, high paying customers in need of the client’s legal help.
Now, it’s important to keep in mind that the client has a small local law firm focused in a niche area of practice. His website traffic is low. He doesn’t have millions of visitors per month. He has hundreds. But, just a few extra customers translates to many thousands of dollars in extra revenue for the law firm.
So you’re probably wondering: are the results from this A/B test valid?
Not entirely. They’re not rooted in a strong level of confidence. They’re based on a very limited sample. And to gather sufficient data, the test would have needed to run beyond the recommended 6-8 weeks.
“Best practices” weren’t completely followed.
But it’s likely the client will never get enough traffic to run a completely valid A/B test.
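Even with limited traffic, you can at least quantify how much (or how little) confidence the results carry. Here’s a minimal sketch of the standard pooled two-proportion z-test; the visitor and conversion counts are hypothetical, since the case study doesn’t publish its raw numbers.

```python
import math
from statistics import NormalDist

def two_proportion_z_test(conversions_a, visitors_a, conversions_b, visitors_b):
    """Pooled two-proportion z-test; returns the z statistic and the
    two-sided p-value for the difference in conversion rates."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical low-traffic numbers: 8/300 vs 15/300 conversions
z, p = two_proportion_z_test(8, 300, 15, 300)
```

With numbers this small, even a near-doubling of the conversion rate can fail to clear the conventional p < 0.05 bar, which is exactly the trade-off described above: act on a promising but uncertain result, and keep monitoring it.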
So there were two choices: sit there and wait. Or run a test and evaluate the results.
Running this test enabled the client to achieve a solid double-digit conversion lift and make a whole lot more revenue. In the meantime, we will continue to monitor the results to make sure the change is having a long-standing positive impact.
So far, so good!
Tell us about the A/B Testing Summit and what you think people will get out of it by attending – perhaps that they wouldn’t get anywhere else?
The A/B Testing Summit is one of the largest online A/B testing and digital marketing conferences.
Over three days, from June 11-13, you’ll hear from knowledgeable speakers covering a wide range of A/B testing topics from experimentation to analytics to personalization.
All sessions are pre-recorded so you can watch the webinars at your own convenience. The speakers have a wealth of knowledge to offer you to improve and inspire your own testing success.
You can register for the Summit for FREE.
Catch Crazy Egg and 29 Other Conversion Experts at the A/B Testing Summit!
Bonus A/B Testing Webinar
Join us for A vs. B: The Ultimate Testing Champion, a *live* 1-hour discussion panel on June 13th, and learn to design and deploy tests—plus how to action results (without it being completely overwhelming).
We’ll be chatting with marketing experts from PathFactory, Optimizely, ON24, Demandbase, and Vidyard.
Note: Top image by Emily Morter