EDU SERIES – PPC: Part 6 of 6 – A/B Testing
A/B testing stands for “Always Be Testing,” right? It should! Once you start testing various elements of your marketing campaigns, you begin to understand the massive benefits. However, you never know exactly what’s going to work with your audience until you A/B test it, so there are no hard and fast rules. Below, we have put together a few results we’ve found over the years.
A/B Test Where Your Form Is on The Landing Page
Many of the best A/B test results have come from testing things that don’t necessarily fit into best practices. One of the most exciting results we’ve seen came from simply moving the form on a landing page from the standard right side to the center of the page.
The existing landing page was already pretty tightly optimized and had a conversion rate of about 11%. By making that one simple change and moving the form to the center, we increased the conversion rate by nearly 50%, to just a hair under 16%.
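As a quick sanity check on numbers like these: relative lift is just the change in conversion rate divided by the baseline rate. A minimal sketch in Python, using the approximate rates quoted above (the figures are the article’s; the code itself is illustrative):

```python
# Relative lift between two conversion rates.
# 0.11 and 0.16 approximate the rates quoted above; the exact
# underlying figures would shift the result slightly.
def lift(old_rate: float, new_rate: float) -> float:
    """Return the relative improvement of new_rate over old_rate."""
    return (new_rate - old_rate) / old_rate

print(f"{lift(0.11, 0.16):.0%}")  # prints 45%
```

Going from 11% to just under 16% is roughly a 45% relative lift, which is how a five-point absolute change becomes a “nearly 50%” improvement.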
Small A/B Tests Can Have Profound Impacts
Back in 2007, I tested URL initial capitalisation in AdWords ads and achieved a 53% lift in click-through rate.
The test was simple: I created two identical ads except for the URL. One used the standard www.sitename.com, while the other used initial capitalisation and looked like www.SiteName.com. The test was repeated numerous times and always saw a positive lift.
Sadly, Google eliminated this type of URL display. However, at the time, it reconfirmed that small changes could have a profound impact and revealed more about how users “looked” at search results.
Your A/B Tests Don’t Have To Be Perfect
We were working with a client who had a terrible landing page. It wasn’t well designed, and visitors had to click a button to reach an equally terrible form page in order to convert. I begged them to create a new page and at least put a clean form on the landing page. They came back with a page that still wasn’t pretty and had their terrible form simply embedded on it.
The new version increased the website’s profit 76%.
That’s when my perfectionist brain changed and I realised: you don’t have to be perfect. To achieve greater results, all you need to do is be better than you currently are.
A/B Tests Show That Small Wording Changes Matter
For PPC ads, we never cease to be amazed by how slight changes in emphasis can produce enormous improvements in click-through rate. Here’s a case in point from last week. We were A/B testing these calls to action:
- Get £10 off the first purchase. Book online now!
- Get an additional £10 off. Book online now.
The CTR doubled with option … B.
A/B Test Reveals Having More Steps (Not Fewer) Yielded Stronger Conversion Rates
We tested two landing pages: the control presented the product with its add-to-cart options, while the variant positioned the product but kept visitors one click away from the actual product page.
Despite the extra step and more language to get through, the variant substantially outperformed the more direct path. It had a stronger conversion rate (+18%, with 95% statistical significance reached) and a higher AOV. It helped us recognise where we were engaging traffic within the buying process, and as a result we were able to use the findings to better position other products prior to the sale.
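For readers wondering what “95% statistical significance” means in practice: a common way to check it is a two-proportion z-test on each variant’s visitor and conversion counts. A minimal sketch, using made-up counts (the article does not publish the underlying data):

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-score for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis of no real difference.
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical counts: 5,000 visitors per variant.
z = two_proportion_z(conv_a=550, n_a=5000, conv_b=650, n_b=5000)
# |z| > 1.96 corresponds to 95% confidence (two-tailed).
print(abs(z) > 1.96)  # prints True
```

The practical takeaway is that significance depends on both the size of the lift and the amount of traffic, which is why small tests often need to run for weeks before a winner can be declared.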
Test A New Call-to-Action Button
Another great use of A/B testing was being able to lift conversions by simply testing a new call-to-action button. For this client, we did not have the resources to create and test all new pages. We decided to try splitting traffic between two different button designs.
We ran the A/B test for just over seven weeks and, in the end, were able to increase the form conversion rate by 11%.
Never count out testing the smallest elements. You can easily increase your conversions and revenue even with limited resources.
Long Conversion Form Beats Short Conversion Form
In one instance, “excessive content” won over “usability & conversion focus.” We approached one of our PPC marketing clients about creating a new landing page that would encompass a “hybrid model” consisting of the most important aspects of the product they were selling. It was a surprisingly hard sell: the client told me that even though he understood the strategy, his audience would rather read dozens of pages than have it all in a compact, easy-to-read, above-the-fold, monetisation-friendly format.
End result: the new landing page had a higher bounce rate and lower conversions compared with the existing one. This experience taught us one key lesson: keep A/B testing, and listen to your clients!
Resist The Urge To Follow Best Practices
You can create a hypothesis based on years of experience and hundreds of successful tests, but outcomes are impossible to predict, especially when people (website visitors) are involved. We were working with an e-commerce client who developed a new landing page. The new page was gorgeous, and we were 100% sure it would dramatically increase conversion rates. However, the old page won the A/B test. That old page had bad graphics, a confusing layout, and small, poorly written copy, and those are just a few of its shortcomings.
Our initial inclination was to just switch over to the new page; that’s how confident we were in it. We’re glad we didn’t, because we would have missed out on a great (and frustrating) learning experience.
A/B Tests Can Reveal Visitor Sentiment
We suspected that allowing users to click out of the funnel on a website was causing a lot of dropoff, so we wanted to see if removing the navigation would help people convert.
What actually happened was that users felt trapped, and our bounce rate shot through the roof! By giving people the freedom to navigate away from the funnel, we ended up with a higher conversion rate than when we constrained them. Counter-intuitive, but an absolutely crucial lesson.
A/B Testing an Offer at the End of Sales Copy vs Above the Fold
One of the most surprisingly successful yet head-smackingly simple results we’ve ever seen with A/B testing happened a few years ago. We took the exact same offer/CTA that lived at the end of the sales copy on an SEO landing page and repeated it above the fold, just below the intro paragraph. The A/B test yielded more than a 400% increase in conversion rate. The logic behind the experiment was that we’d immediately interrupt a user’s flow as they read the landing page copy (which was long, given the page was structured as an organic traffic asset) and give them the option of converting sooner rather than later. Before making the change, we had no idea it would be as successful as it was, since it went against what we believed about creating a quality organic landing page experience. Interrupting flow before making a solid case for the benefits and the USP, and repeating offers, seemed somewhat counter-intuitive and heavy-handed at the time. But then, that’s why we A/B test. We were able to replicate the results on other pages on the site as well, which only reinforced the practice.
Split Testing Buttons By Color
We created a remarketing campaign targeting lost sales (shoppers who had abandoned the cart). We designed two sets of ads that were essentially the same: the font, offer, and imagery were all identical. The only difference was the color of the “Shop Now” button: one set used a gray button while the other used bright green. We rotated the ads evenly for two weeks and were very excited by the results. The ads with the green button converted at three times the rate of the ads with the gray button, and their click-through rates were considerably better as well. The green allowed the ads to stand out much more, putting our offer in front of more of those lost sales. The notion of standing out on the Display Network was absolutely confirmed in this test.
Challenge Conventional Marketing Wisdom
This lesson hit home when a landing page with benefit-driven copy lost to a landing page with product-focused copy. In the tests we’ve run, benefits almost always outperform features, and as we all know, a benefit-focused message is a critical element of successful marketing. Yet in this one campaign, visitors actually wanted to know the features of the product we were selling. So depending on your target audience, the stream of traffic, the message of your ad, your product, etc., it might actually make sense for you to focus on features instead of benefits. But you won’t know that until you test.