Evolve your email marketing with A/B Split testing

It’s been a busy day. You and your team have been working hard on this week’s email newsletter, and it’s looking good: the content is fresh and relevant, and you’ve got something to say to your readers. In fact, you’ve been focusing so hard on the content that you haven’t really had time to step back and reflect on your overall email strategy. The weekly newsletters are doing well, with lots of opens and a good click-through rate. The conversions are always good, so you’re confident that what you’re doing is right – it’s working well, after all.

But, could it be better? A small increase in conversions could mean a big uplift in revenue generated. A few more social shares and the reach of your brand is significantly increased.

In the always-busy world of email marketing it can be difficult to make time to try out new approaches or to talk to your audience in a different way. Aside from the resource hurdle, there’s the risk factor too. Dare you put all your eggs in one basket with your weekly newsletter? What if the new tactic doesn’t improve results, or worse, performs less well than the tried-and-tested approach?

It doesn’t have to be this way. Whilst the digital world enables us to be agile and disruptive, pushing boundaries, there is a safe way to evolve your email marketing strategy, and it’s more than likely already at your fingertips – multivariate split testing (or simple A/B split testing).

“Always be testing” should be one of your (many) email mantras. Your programme arrived at its current stage by being tested on some level during its development. If you can speed up that evolution, your strategy can only grow and improve at a faster rate – which is, after all, what we are all looking for: better results, improved ROI, simplified processes.

Sounds fair enough – but how do you put A/B split testing into practice?

There are many elements that make up your email message, and each has its own effect on performance. The ‘mail from’ name and the subject line will have a key influence on whether your email is actually opened by the recipient. The timing of the send and the layout of the content will then determine how recipients engage with your message once it is opened. Even the colour and shape of your CTA buttons will affect the performance of the email newsletter.

There are a lot of elements to be tested – where on earth do I start?

If you aren’t testing any elements at the moment, it’s a good idea to start small and use A/B split tests on the obvious factors. Since performance ultimately rests on how many people read your message, testing the ‘mail from’ name and the subject line is a great place to start.

Using the A/B split test feature in your email solution (if your ESP doesn’t have this feature, it’s definitely time to take a look at other providers on the market!), decide on a simple subject line test – running three different subject lines against each other.

Typically this involves selecting the number of splits and the ‘winning’ factor (in this case opens), then defining the testing window, e.g. three hours. You then broadcast your newsletter as normal, except that in this instance three different versions, each with a different subject line, will be mailed – but only to a small proportion (e.g. 10%) of your database. Once the testing window is over, the ‘winning’ version is rolled out automatically to the remainder of your database.
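If it helps to see the mechanics laid out, here is a minimal sketch of that rollout logic in Python. The names used (send_email, count_opens, and so on) are placeholders, not a real ESP API – your provider handles all of this for you behind its A/B split test feature; this just illustrates the flow of sample, wait, pick a winner, roll out.

```python
# Minimal sketch of the split-test rollout described above.
# send_email(recipients, subject) and count_opens(subject) are hypothetical
# stand-ins for whatever your ESP actually exposes.

import random
import time

def run_subject_line_test(recipients, subject_lines, send_email, count_opens,
                          sample_fraction=0.10, window_seconds=3 * 60 * 60):
    # Pick a random sample (e.g. 10%) of the list and split it evenly
    # between the subject line variants.
    sample_size = max(len(subject_lines), int(len(recipients) * sample_fraction))
    sample = random.sample(recipients, sample_size)
    sample_set = set(sample)
    remainder = [r for r in recipients if r not in sample_set]

    chunk = len(sample) // len(subject_lines)
    for i, subject in enumerate(subject_lines):
        send_email(sample[i * chunk:(i + 1) * chunk], subject)

    # Wait out the testing window (e.g. three hours).
    time.sleep(window_seconds)

    # The 'winning' factor here is opens; clicks or conversions work the same way.
    winner = max(subject_lines, key=count_opens)

    # Roll the winning variant out automatically to the rest of the database.
    send_email(remainder, winner)
    return winner
```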

This simple and quick method gives you the opportunity to try different styles of subject line. You can pit your usual style against a quirky line and an informative line; what you use depends on the content and aim of the email you are sending.

With this easy-to-implement test you can send three or more different versions of your newsletter every time you broadcast, confident that the best-performing version will always be seen by the vast majority of your audience. Each weekly iteration evolves your email programme.
Very soon you will have moved beyond the safe, but not necessarily ideal, variant and will be confident that you are serving your customers messages that appeal to them.

As mentioned above, the subject line is just one factor to test. Once the benefit of testing has been demonstrated, you are in a position to begin the journey towards testing nirvana. Even slight variations in design – from layout through themes to button shape and placement – can make a measurable difference to how your audience responds.

Use A/B split testing on these elements to find out which works best for your audience (a simple sketch of a rolling test plan follows the list):

  • Subject line style – quirky or dry, % discount or amount saved, etc.
  • ‘Mail from’ name – does a person’s name or the business name speak to your audience better?
  • Time of day / day of the week – even with the rise of always-on mobile devices, timing can be crucial.
  • Image-heavy or copy-heavy design
  • Layout styles
  • Button shapes / placement
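One lightweight way to turn that checklist into an “always be testing” habit is to keep a simple backlog of elements and variants and work through one test per broadcast. The sketch below is purely illustrative – the elements and variants are examples to adapt, not recommendations, and nothing here is tied to a real ESP.

```python
# A hypothetical rolling test plan: one element under test per weekly send.

test_backlog = [
    {"element": "subject line style", "variants": ["quirky", "dry", "informative"]},
    {"element": "mail from",          "variants": ["person's name", "business name"]},
    {"element": "send time",          "variants": ["8am", "12pm", "6pm"]},
    {"element": "design balance",     "variants": ["image-heavy", "copy-heavy"]},
    {"element": "layout",             "variants": ["single column", "two column"]},
    {"element": "CTA button",         "variants": ["rounded, top", "square, bottom"]},
]

def next_test(week_number):
    """Rotate through the backlog so each weekly send tests one element."""
    return test_backlog[week_number % len(test_backlog)]

print(next_test(0))  # week 1: subject line style
```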