Testing, testing: killer conversions through solid A/B testing

 admin | 15/May/18 | No responses

Given that you’re reading this, we’re assuming that you regularly invest time in your marketing education. Along your journey, you’ve almost certainly come across the term ‘A/B testing’, yet for those of you who aren’t pro marketers, the term may seem elusive.

So here we demystify A/B testing (also known as ‘split testing’ or ‘bucket testing’) – explaining the why and the how – for conversions that improve month on month.

A/B Testing – what’s the point?

Let’s face it: no matter who your target market is, consumers are notoriously unpredictable, making marketing campaigns anything but a precise science. What seems like a winning advert, a great video or a well-designed landing page can fall miserably flat.

Often, this can leave you with little to go on unless you’re A/B testing. Along with providing the answers as to what may be going wrong (as well as right), A/B testing can also deliver many other benefits, including:

– Better engagement with your content

– Increased website sales or enquiries

– Reduced bounce rates on your website

– Higher conversion values – such as a higher average order value

– Reduced risk of investing too heavily in the wrong marketing decisions

– Reduced cart abandonment

– Increased sales

Now that that’s out of the way, what exactly is A/B testing?

A/B testing has a simple concept – comparing two versions against one another and testing them both to see which converts better. Think of it as an experiment – one in which there are plenty of variables to fiddle with to find out what’s working, what’s not, and what’s worthy of further exploration. Given the mammoth number of objectives that A/B testing can aim for, and for the purposes of keeping this feature super clear, we’re going to take the example of a social advert and run with it. However, the same principles can be applied to many of your marketing tasks – such as your landing page, app, website home page, ecommerce store or search engine PPC.
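That ‘bucket testing’ nickname comes from how the experiment splits its audience. Here’s a minimal sketch of how the bucketing typically works, assuming you have some stable user identifier to key on (the function name and the 50/50 split below are our own illustration, not a feature of any particular ad platform, which will handle this for you behind the scenes):

```python
import hashlib

def assign_bucket(user_id: str, experiment: str = "social-ad-test") -> str:
    """Deterministically assign a user to variant A or B.

    Hashing the user ID together with the experiment name keeps the
    split stable across visits (the same user always sees the same
    variant) while remaining roughly 50/50 across the whole audience.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"
```

Because the assignment is driven by a hash rather than a fresh coin flip on every visit, a returning visitor always lands in the same bucket, which keeps the comparison between versions clean.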

The social ad – taking a look at many elements

We’re huge fans of Facebook. We like the sheer versatility of its advertising options, not to mention the incredibly precise targeting. So it provides fertile ground on which to play around with your marketing campaign.

Let’s take the average Facebook advert – you could mix, match and test the following elements:

– The type of content – such as a video, a free eBook download or a simple image advert

– The placement of the advert – such as in the feed or right-hand column

– The copy – both in the headline and in the body

– The imagery – as well as any graphics/text overlays

– Capitalised words in the copy – Free EBOOK download or FREE eBook download?

A/B testing a social ad: a visual walkthrough 

Here’s an example of a Facebook advert. We’ve outlined your potential options when it comes to elements and ad variations…

– The content: A – Advertises a free eBook guide, B – Advertises a Podcast

– The copy: A – Focuses on the customer’s pain point of cost, B – Focuses on the customer’s desire to win more equine clients

– The capitalisation of the copy: Free EBOOK download or FREE eBook download?

– Image: A – Shows a smiling female team member, B – Shows a smiling horse

PressPoint Bonus Tip

Meanwhile, over on your website, variables could include:

– The location of the call to action

– The exact text used

– The button colour or surrounding space

– The amount of whitespace

– The imagery

Five traps NOT to fall into

For the novice, A/B testing can be tough, at least at first. There are plenty of missteps to be made, and more than a fair share of pitfalls to tumble into. Here are five of the most common we hear about…

A/B testing mistake #1 – Testing too many elements at once

While you should be aiming to test numerous elements over the course of your testing, you won’t ever be able to draw any certain conclusions as to what’s working, and what’s not, if you test more than a single element per round.

A/B testing mistake #2 – Creating too few variations

Even if you’re just testing the headline copy of an ad, you really need three or more differing versions to get to grips with how this critically important part of your ad is influencing your audience (or not, as the case may be).

A/B testing mistake #3 – Testing elements with the lowest impact

Unless you’re working with a huge marketing budget, you’re going to have to draw the line somewhere when it comes to which elements you test. This should always include the placement (feed versus right-hand column) if you’re testing a Facebook advert, and beyond this, our top three elements would likely be: headline copy, image and content type.

A/B testing mistake #4 – Making assumptions on limited results

A/B testing is about driving results and reducing marketing costs. But the saying that you need to speculate to accumulate certainly applies here. If you spend too little on your A/B testing, your results are going to be invalid. You need a healthy chunk of user data to draw on.
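To make that ‘healthy chunk of user data’ point concrete, here’s a hedged sketch of the standard two-proportion z-test that most A/B testing calculators use under the hood (the function name and the figures in the usage note are our own illustration, not from any particular tool):

```python
import math

def ab_significance(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for a two-proportion z-test.

    Returns the probability of seeing a conversion-rate gap this
    large by chance alone; a common rule of thumb is to act only
    when the p-value falls below 0.05.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pool both variants to estimate the shared conversion rate
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution
    return math.erfc(abs(z) / math.sqrt(2))
```

With 2,000 impressions per variant, a jump from 100 conversions to 140 (5% versus 7%) comes out as statistically significant. The very same rates measured over only 100 impressions each (5 versus 7 conversions) do not, which is exactly why skimping on test spend leaves you with nothing you can safely act on.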

A/B testing mistake #5 – Not split testing external elements

So far, we’ve spoken only about the elements that appear on your advert, or where it may be placed. However, there’s really a wide-ranging collection of external variables that could and should also be tested, at least where FB ads are concerned.

In a recent study, marketing powerhouse AdEspresso ranked the elements by their influence on return on investment as follows:

– Countries

– Precise interests

– Facebook ad goals

– Mobile OS

– Age ranges

– Genders

– Ad designs

– Titles

– Relationship status

– Landing page

– Interested in

Notably, this test was for a company marketing a product to the masses. Chances are that your target market is far more refined, so your list could look a little more like this:

– Ad design

– Ad copy, especially the headline

– Your unique value offer

– Ad placements

– Call-to-action buttons

– Bidding methods

– Campaign objectives

A word on Facebook campaign structures

For A/B testing, Facebook hands you a helpful tool that many other platforms don’t – the ability to structure your campaign in a way that can provide the most relevant data. Here’s an overview…

Creating a single set of adverts — all your advert variations within one ad set

When you choose this approach, the good news is that your audience won’t be presented with every advert variation, which will happen when you use multiple sets. The bad news? This campaign structure will lead to Facebook auto-optimising your adverts – quickly favouring one variation and starving the others of impressions – meaning that you won’t receive results you can compare fairly.

Multiple single-variation advert sets — each variation in a separate set

This second structure gets around the problem of Facebook optimising your adverts with only minimal data to go on. However, you may find that some of your audience are shown numerous ad variations over the course of your campaign. The other way of looking at this is that it could lead to richer results that identify what it is that gets these people clicking.

PressPoint Pro Tip

Cutting to the chase for this section: if you want valid testing results, you’ll need to set your campaign up with each variation in its own ad set.

Three final tips for effective A/B split testing

1. Test your headlines with differing superlatives

Super-do-what now? Let’s clear that term up first – superlatives are the extreme forms of adjectives and adverbs, words that push a description to its limit, such as: “the oldest mare”, “the strongest stallion” or “the colt that runs quickest of all”.

An extensive study of headlines by Outbrain discovered that headlines with negative superlatives performed a staggering 30% better than headlines with positive superlatives. Rather than “always” or “best” performing well, it was words such as “never” or “worst” that drove the most engagement.

2. Trial other headline strategies

Headlines that ask questions, cite statistics, promise a step-by-step how-to solution, or directly address the audience can all work well for conversion, and are more than worthwhile split testing within your campaign.

3. A/B test the entire user journey

So, your split testing went well, and you now have a single, seriously effective advert when it comes to conversions. But what about the journey onwards? There’s little point in perfecting your advert if the landing page or website destination is going to be disjointed. Make sure your design, brand colours, tone of voice and imagery all fits together, and most of all ensure that you make good on any promises provided in the advert – such as resolving a problem, providing information, or offering a free eBook download.


A/B testing takes a lot of time, plenty of effort, and an ongoing commitment to studying stats and data. If you’d prefer to be crunching numbers that include £ signs, rather than user data, you may want to leave the marketing campaign and testing to us. Call our team on 01953 851513, or send a message over and we’ll be right back in touch – [email protected].


