Performing A/B testing of the primary communication elements of Inbound Marketing – your website, content, emails and landing pages – is a great way to identify effective (and less effective) ways to communicate your message and convert traffic into leads. The feedback comes directly from users and is based on their reactions to different designs, copy, offers, calls-to-action, user experiences, and more.
But too often, the results of A/B testing are used to identify how a product or service should be marketed. A/B testing, however, should follow – not determine – a well-reasoned content strategy and be used to pinpoint which specific way of communicating the same general message is most effective.
When am I ready for A/B testing?
There's no perfect time for A/B testing. You could be just launching your content marketing plan, in the midst of executing your current strategy, or somewhere in between. The only time you're not ready for A/B testing is when you don't have a plan that outlines what you think is currently happening (your hypothesis) and what you want out of the A/B testing.
Step 1 of the plan: formulate a well-thought-out hypothesis. If you have some sense of why certain offers or elements aren't working, you may come up with a hypothesis like one of these:
"Our customers leave our website after they don't get the product information on the landing page."
"Our testimonials are too exaggerated and don't result in sales."
"Customers aren't filling out the form because it's too long (too personal, etc.)."
Your hypothesis should be detailed enough so that you can start formulating your A and B variations.
A/B testing execution
A complete set of first round analytics that look at all relevant metrics will give you a baseline comparison for your testing and serve as your control. You'll want to keep this control handy and use it in every test, whether it's email design, landing page copy, navigation, PPC ads, etc.
What kinds of things can you change?
If you're committing yourself to A/B testing, there's no point in stopping at "A" and "B." Test as many variations as make sense and have potential. You can change just about anything that's part of your hypothesis. Keep the goal in mind; you're looking to either support or refute your hypothesis when you're done.
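Splitting traffic evenly across several variations can be sketched with simple deterministic bucketing. This is a minimal illustration, not any particular tool's implementation; the function name and visitor IDs are hypothetical. Hashing the visitor ID (rather than picking randomly on each page load) keeps a returning visitor in the same bucket:

```python
import hashlib

def assign_variant(visitor_id: str, variants: list) -> str:
    """Deterministically bucket a visitor into one of N variants.

    The hash spreads visitors roughly evenly across variants, and the
    same visitor always lands in the same bucket on repeat visits.
    """
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# Hypothetical usage: three competing CTA designs
print(assign_variant("visitor-42", ["A", "B", "C"]))
```

Sticky assignment matters for measurement: if a visitor saw variation A yesterday and variation B today, you can no longer attribute their conversion to either one.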
Dan Siroker, formerly of Google, worked on the Obama campaign and became well known for his A/B testing work. He tested seven variables on the splash page alone and evaluated the results. The video is lengthy, but when you get to his examples at 5:37, you can see how adjusting the user benefits greatly increased results and donations for the Obama campaign. Keep "user benefit" in mind as you watch this video.
What changes have the greatest results?
It should come as no surprise to anyone that the biggest opportunities to improve response come from improved headlines, photography, and other graphic devices. In our experience, Call-to-Action headlines that answer the question "What's in it for me?" are far more compelling than those that simply indicate what the button is for.
Be mindful of statistical significance while you're evaluating your results. Your sample size needs to be large enough that the difference you observe is unlikely to be random noise rather than a real effect.
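One common way to check significance for click rates is a two-proportion z-test. The sketch below uses hypothetical click counts (66 vs. 60 clicks out of 3,000 views each, i.e. 2.2% vs. 2.0%) to show how a seemingly meaningful gap can fail the test at typical traffic volumes:

```python
import math

def two_proportion_z(clicks_a, views_a, clicks_b, views_b):
    """Two-proportion z-test: is the difference in click rates
    likely real, or plausibly just noise?"""
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    # Pooled click rate under the null hypothesis of no difference
    p = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(p * (1 - p) * (1 / views_a + 1 / views_b))
    return (p_a - p_b) / se

z = two_proportion_z(66, 3000, 60, 3000)
print(abs(z) > 1.96)  # prints False: not significant at the 95% level
```

With these numbers, z is only about 0.54, well below the 1.96 threshold for 95% confidence, so you'd need either a bigger gap or far more traffic before declaring a winner.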
A/B testing can be time-consuming, but you'll be rewarded with information that helps you tailor your message and other elements so they're more targeted, powerful and effective, and that leaves you feeling more confident that you're getting the most from your communication tactics.
Sample Approach To A/B Testing
Much like the example in the video above, you need to identify the goals that shape your hypothesis. I'll demonstrate with an example from our own A/B testing routine.
Hypothesis: A CTA button with the words "download now" will produce higher conversions than our regular text of "FREE Guide."
Test design: We use our traditional CTA design with the "FREE Guide" text as our control and design a test CTA that uses "Download Now." All other design aspects of our CTAs remain the same. We run this test long enough to get a significant volume using HubSpot's variation test setup. This feature evenly distributes the CTAs to visitors while leading them to the same landing page. We evaluate based on views-to-click rate. This HubSpot function also offers a clicks-to-submissions rate for evaluation. Here are our two evaluated CTAs:
During our variation testing period, the control had a views-to-click rate of 2.2%, which includes all of the locations this variation was placed on our site: home page, blog sidebar and blog entry footers. Our Variation 2 CTA had a views-to-click rate of 2.0%. While the difference seems relatively small, if this CTA is exposed to 3,000 visitors per month, the extra 0.2% translates to 6 more landing page visits. Our landing pages convert at over 20%, so that's 1.2 additional leads per month. We'll take it!
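The arithmetic behind that estimate is worth making explicit, since it's the same back-of-the-envelope calculation you'd run for any CTA test. A quick check using the figures above:

```python
# Monthly figures from the example above
visitors = 3000            # monthly CTA impressions
rate_control = 0.022       # 2.2% views-to-click (control)
rate_variant = 0.020       # 2.0% views-to-click (variation)
landing_conversion = 0.20  # landing page converts at 20%

# Extra landing page visits from the better-performing CTA
extra_visits = visitors * (rate_control - rate_variant)
# Extra leads after the landing page converts those visits
extra_leads = extra_visits * landing_conversion
print(round(extra_visits), round(extra_leads, 1))  # 6 1.2
```

Chaining the rates this way also shows where to focus next: doubling the CTA click rate and doubling the landing page conversion rate each double the lead count, so test whichever is cheaper to move.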
Learn more about elements like CTAs and their importance to website effectiveness by downloading our "Website Usability Checklist for Inbound Marketing" now!