First of all, I need to give a shout out to a commenter on one of my previous articles who (in part) inspired this piece. The comment came from Mitch Rezman, and is as follows:
Mitch actually raises two issues that I want to deal with separately.
The first is whether or not CRO services can ever be guaranteed (hint: no) and the second is about the flooding of the market with bad CRO testers, and how to spot the snake oil salesmen.
“I can guarantee your site the #1 spot on Google!”
If you’ve owned or operated a website at any time in the past decade or so, you’ll no doubt have received poorly written, cookie-cutter emails from SEOs linking to scammy-looking pages making outlandish promises.
In 2015, practices like keyword stuffing, hidden content, spamming forums and so on have now all but disappeared for two reasons:
- The introduction of algorithms that are better at sniffing out good content and penalizing the bad
- Google has done a good job of explaining that these practices will actually have a negative impact on a site’s ranking
The downside is that all those snake oil salesmen who used to peddle black hat SEO services have had to go somewhere else, and it appears that CRO may be one of the places to which they’re flocking.
There Are No Guarantees In CRO
A lot of consumers and business owners are impatient. Some won’t look twice at a product unless they believe that it can get guaranteed results.
This leads businesses to bend the truth by using statistics to back up claims that are suspect, or sometimes just outright lies. You can’t make guarantees about CRO.
Granted, you can probably find at least one or two tests that will have a positive outcome on even a very well optimized site, but there is always the possibility that you won’t be able to improve conversion at all. This is true for a few reasons:
- Poor PPC/SEO could be bringing in entirely the wrong audience. Changing the site won’t change that.
- Even if a site is bringing in the right audience, every audience is different. What some claim will work for EVERY page may not be true for a particular site.
- Split test results can be illusory. Even a variation that appears to improve conversion during a test may not hold up once it’s implemented permanently.
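That last point is worth unpacking: an apparent lift in a split test can easily be noise. As a minimal sketch (using entirely hypothetical numbers, and a standard two-proportion z-test rather than any particular tool’s method), here’s how a 25% observed lift can still be statistically indistinguishable from no change at all:

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Is variant B's observed lift over A real, or plausibly just noise?

    Returns (relative_lift, two_tailed_p_value) using the pooled
    normal-approximation z-test for two proportions.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-tailed p-value via the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_b / p_a - 1, p_value

# Hypothetical test: A converts 40/1000, B converts 50/1000
lift, p = two_proportion_z_test(40, 1000, 50, 1000)
print(f"observed lift: {lift:.0%}, p-value: {p:.2f}")
```

With those numbers the observed lift is a headline-friendly 25%, but the p-value is roughly 0.28 — far above any conventional significance threshold — so rolling the change out permanently could just as easily deliver nothing.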
The fact that, as Mitch says, some people in the CRO business are making guarantees about their products/services is worrying.
How To Spot A Bad Split Tester
I’d encourage all of you to treat unsolicited emails filled with guarantees from anyone positioning themselves as a “split tester” with caution, especially if they quote a set rate—it’s nowhere near that easy to quantify the value of A/B testing.
If you’re looking for ways to improve conversion, you may be better off working with a digital marketer who has conversion rate optimization as one of their specialties. While tests are running, they can help you with other projects too.
Highly skilled, effective A/B testers are like gold dust, and the best of the best are either a) busy or b) very pricey. But, if you still want to hire someone solely for CRO, watch out for any of the following red flags associated with poor CRO:
1. Offering Guaranteed Results
As I’ve already explained above, it’s no more possible to guarantee a 10% improvement in conversion rate than it is to promise the #1 spot on Google. Average improvement for previous clients? That’s a number I’d rather see.
2. No Long Term/Unified Goals
While it’s true that most sites will have some “quick wins” that can be drawn up, tested and implemented in a very short space of time, I’d have doubts if all an A/B tester wants to do is change button colors and CTAs.
I want to see big hypotheses about audiences and how they engage with sites, which can be broken down into a number of smaller tests that have a broader goal.
3. Lack Of Research
Come across an A/B tester who wants to start testing on day one? Run away.
Before a skilled tester even thinks about running a single test they’ll be delving into analytics to find weak spots or holes in the funnel, quizzing you about your site/products/services and looking into how your competitors do things.
On the other end of the scale, it’s important that you give testers you’re working with time to look into this stuff and not barge in asking “WHY ARE WE STILL ONLY CONVERTING AT 3%?!” after their first lunch break.
4. Poor Transparency
It’s not a good sign if an A/B tester can’t (or, more likely, won’t) share information about their process with you.
Obviously testers don’t want to give too much away—because if you know exactly what they’re going to do, why would you hire them?—but they should be able to explain the basics.
Again, this goes both ways. Claiming higher user numbers, revenue, etc. than you actually have, or neglecting to mention specific processes in your business, can be a real spanner in the works for a tester.
5. Improper Use Of Metrics
Another way dodgy split testing resembles bad SEO is that it’s easy to dress up metrics to sound more impressive than they are. It’s all very well ranking #1 for a specific long-tail term, but it’s not much good if no one searches for it.
Likewise, improving conversions by 700% on a landing page won’t help much if that page only gets 2 visitors per month.
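The traffic problem here isn’t just rhetorical: with tiny visitor counts you can’t even run a meaningful test. As a rough sketch (with hypothetical numbers, using the standard normal-approximation sample-size formula for a two-proportion test at 95% confidence and 80% power), here’s how much traffic each variant would need to reliably detect a realistic lift:

```python
from math import ceil, sqrt

def sample_size_per_variant(base_rate, relative_lift,
                            z_alpha=1.96, z_beta=0.84):
    """Approximate visitors needed PER VARIANT to detect a given
    relative lift over a baseline conversion rate
    (95% confidence, 80% power, normal approximation)."""
    p1 = base_rate
    p2 = base_rate * (1 + relative_lift)
    p_bar = (p1 + p2) / 2
    n = ((z_alpha * sqrt(2 * p_bar * (1 - p_bar))
          + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
         ) / (p2 - p1) ** 2
    return ceil(n)

# Hypothetical: 3% baseline conversion, hoping to detect a 20% relative lift
n = sample_size_per_variant(0.03, 0.20)
print(n, "visitors per variant")
```

That works out to somewhere in the low tens of thousands of visitors per variant. A page getting 2 visitors per month would need centuries of traffic, which is why a 700% “improvement” on it is a meaningless number.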
It’s unfortunate that I have to write this post. Until recently, A/B testing has been a very open space with lots of sharing, case studies, best practices etc. Now it seems there’s a lot more of this around…
A few bad eggs threaten to ruin that for everyone by obscuring their tactics and potentially making people skeptical of CRO in the same way that many people still associate SEO with spammy practices.
Or maybe I’m just being paranoid…
What do you think? Have you noticed any recent changes to the way CRO is viewed by those outside the industry?
Featured Image: Black Hat