Do you ever sit at your desk stressing over whether your prospect would prefer the blue call-to-action background or the green one? Whether the word "FREE!" is too cheesy for your target? There are so many small (often seemingly unimportant) decisions made regarding content offers and how they're promoted, each of which could have a significant impact on your prospect's decision to convert – or leave for a competitor's website. By conducting A/B tests that present different versions of the same offer, you're able to learn what appeals to your prospects and, by applying what you learn, improve your conversion rates. Here are some tried-and-true best practices for conducting A/B testing.
Determine what you're going to test
There are lots and lots of components of a content offer that you could A/B test (font, size, shape, colors, image, wording, placement...), but you don't have time to test 'em all. Start with those that make the biggest impact: Are your headlines performing? Are your landing pages' meta descriptions compelling enough? Is the copy on the call-to-action (CTA) enticing enough to get visitors to click on it? Does the image or video on your site engage your audience? These are the primary features of a content offer's CTA and tend to have the greatest influence on a prospect's likelihood to convert (well, in addition to the content's alignment with his or her needs, of course). And don't try to test more than one thing at a time, or you won't understand what it was that triggered the conversion.
Create an 'A' version and a 'B' version...maybe even a 'C' version!
After deciding what single feature you want to test, you need to create variations of that feature. You can create anywhere from two to several variations of a headline, CTA copy, or supporting image.
For instance, for a CTA button you could try two or more of these wording options:
- Download Now
- Learn More
- Get the eBook Now
- Get Your FREE Copy
- Yes, I Want This!
Don't rely on assumptions
When you're creating your A/B test, don't assume that you know what will resonate unless you have recent statistics to back it up (then it's not an assumption anymore). You may think that “Learn More” is better copy for a CTA, but in fact “Download Now” may be a better option.
In this TNW Conference talk, Optimizely's Dan Siroker reviews how his team improved a landing page for the Obama campaign. He shares the variations they created for the landing page with the audience and polls them on which they think would perform best.
Spoiler alert: the audience was overwhelmingly wrong.
Don’t assume that you know what's going to resonate best with your prospects; listen to them and look at the analytics!
Determine what you want to measure
I can't emphasize enough how important this step is! If you don't fully understand what you're trying to improve, don't bother A/B testing. Understanding how you measure success for that tactic is critical. For example, if you're changing a CTA button's text, write down a hypothesis with actual number estimates for which wording will receive the highest click-through rate (CTR). In this example, your goal would be to increase the number of clicks on the CTA.
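To make the hypothesis concrete, here's a minimal sketch of the CTR math itself. The click counts and impression counts below are hypothetical placeholders, not real campaign data:

```python
def click_through_rate(clicks: int, impressions: int) -> float:
    """Return CTR as a fraction; 0.0 when there are no impressions yet."""
    if impressions == 0:
        return 0.0
    return clicks / impressions

# Hypothetical hypothesis: "Download Now" will beat "Learn More".
ctr_learn_more = click_through_rate(40, 2000)    # 2% CTR
ctr_download_now = click_through_rate(60, 2000)  # 3% CTR
```

Writing the estimates down as numbers like this forces you to state, before the test runs, what "winning" actually looks like.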
Determine how you're going to measure your KPIs
Once you know what you want to measure, you need to determine how you're going to measure it.
If you have a website platform that provides analytics, such as HubSpot or Squarespace, it may already be set up to gather the data you need. If not, there are numerous other tools such as Google Analytics, Google Tag Manager, HotJar's funnel tracker, and more. These tools will allow you to track specific metric goals to see if you're on track to hit your objectives.
Be very specific about how you're going to measure your KPI so there's no ambiguity in the data. Fuzzy data is bad data.
Additionally, you need to make sure you're filtering out your own internal numbers (people in your organization who are clicking on CTAs) so you don't skew your results. This is less important for websites or landing pages with lots of visits, as your internal percentage will be low, but for more niche products and services it's extremely important.
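If your analytics tool doesn't filter internal traffic for you, a common approach is to exclude clicks coming from your office's IP range. The network range and IP addresses below are documentation-only examples, not real addresses; you'd substitute your own:

```python
import ipaddress

# Hypothetical internal network; replace with your organization's actual range.
INTERNAL_NETWORKS = [ipaddress.ip_network("203.0.113.0/24")]

def is_internal(visitor_ip: str) -> bool:
    """True when the visitor's IP falls inside one of our internal networks."""
    ip = ipaddress.ip_address(visitor_ip)
    return any(ip in net for net in INTERNAL_NETWORKS)

# Keep only clicks from outside the organization.
clicks = ["203.0.113.7", "198.51.100.23", "203.0.113.99", "192.0.2.44"]
external_clicks = [ip for ip in clicks if not is_internal(ip)]
```

Most analytics platforms offer an equivalent built-in IP filter; the point is to apply it before you start the test, not after.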
Set up your test
Once you have your test parameters determined, you can set up your tests. There are many platforms you can use to run them.
No matter what platform you're using, you'll want to have a control and a challenger. The control is the landing page or website you already have live; the challenger is the version with your new copy, image, or other parameter that's new/different.
The platform you choose should have the ability to segment your audience randomly between the control and the challenger.
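Your testing platform handles this segmentation for you, but to show the idea, here's a sketch of one common approach: hashing a visitor ID (such as a cookie value) so each person is assigned randomly, yet always sees the same version on repeat visits. The visitor IDs here are made up:

```python
import hashlib

def assign_variant(visitor_id: str, variants=("control", "challenger")) -> str:
    """Deterministically bucket a visitor into a variant.

    Hashing the ID spreads visitors roughly evenly across variants,
    and the same visitor always lands in the same bucket.
    """
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# The same visitor gets the same version every time:
variant = assign_variant("visitor-42")
```

Consistency matters: if a returning visitor bounced between versions, you couldn't tell which one influenced their decision.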
Run your A/B test
Once you have your test set up, it's time to sit back and let it run. Make sure to determine a set amount of time for the first test period. It could be a week or it could be 1,000 page views...you decide, but give it enough time to truly see a pattern.
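"Enough time" really means "enough visitors." A common rule of thumb (roughly 80% power at a 5% significance level) estimates the sample size each variant needs; the baseline rate and lift below are hypothetical:

```python
import math

def sample_size_per_variant(baseline_rate: float, minimum_lift: float) -> int:
    """Rough visitors needed per variant, via the common rule of thumb
    n ~= 16 * p * (1 - p) / delta^2, where p is the baseline conversion
    rate and delta is the smallest absolute lift you want to detect."""
    p = baseline_rate
    delta = minimum_lift
    return math.ceil(16 * p * (1 - p) / delta ** 2)

# Hypothetical: 2% baseline CTR, hoping to detect a 1-point absolute lift.
visitors_needed = sample_size_per_variant(0.02, 0.01)  # 3136 per variant
```

Divide that number by your daily traffic and you have a realistic test duration, rather than an arbitrary week.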
Your testing platform will begin randomly sending visitors to your control and challenger pages even though they all click on the same link. Having both live at the same time allows you to control for changes in audience and time period.
Analyze the results
Once you've let the test run for the set amount of time, you need to analyze the results based on the metrics of success you defined earlier. Which CTA button text actually got the most clicks? Which graphics drew the greater number of conversions on your content piece?
You may be surprised at the results. It could be that the variation you thought would drive more results, in fact, caused your results to drop. Even if your results prove your hypothesis to be wrong, you still walk away from the experiment more educated about your prospects than before you started.
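Before declaring a winner, it's worth checking that the difference isn't just noise. A standard way to do this for click rates is a two-proportion z-test; the click and visitor counts below are hypothetical:

```python
import math

def two_proportion_z(clicks_a: int, n_a: int, clicks_b: int, n_b: int) -> float:
    """Z statistic for the difference between two click-through rates."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    pooled = (clicks_a + clicks_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical results: control got 40/2000 clicks, challenger 80/2000.
z = two_proportion_z(40, 2000, 80, 2000)
significant = abs(z) > 1.96  # roughly the 5% significance threshold
```

Most testing platforms report significance for you; this just shows what that badge in the dashboard is computing under the hood.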
Use the results
Once you have the results, implement what you learned throughout your entire strategy. If you know that certain text leads to more conversions, change the rest of your CTAs to reflect this. Too often we test something but then do nothing with the results; don't let yourself fall into that situation.
You had an idea. You tested it. You got a result that told you whether your hypothesis was right or wrong.
Great! Now do it again!
It's kind of like compound interest: every little bit adds up on top of the other little bits until you have a large amount of change. By constantly testing and iterating on your A/B results, you'll have a far better long-term return on your content.