Your Complete Guide to A/B Testing Your Emails

What if you could collect customer feedback on the effectiveness of your email marketing strategy without having to ask for it? 

How big do your buttons need to be to get clicked? Which colours catch your audience’s attention best? What subject line buzzwords get you the highest open rates? Enter A/B testing: a data-driven way to fine-tune your design, copy, and set-up choices based on customer insights. 

Most email marketers understand the importance of conversion rate optimisation (CRO), but 35% admit to not testing their strategy, giving those who do an instant advantage. 

Regardless of your industry, if your business model includes an online presence as a touchpoint for customers, tested optimisation will be beneficial. Keep reading for our complete guide to A/B testing for email, including when to A/B test, how to curate a testing strategy that drives results, and the key elements to put head to head. 

What is A/B testing?

A/B testing, or “split testing”, is a method of comparing two versions of an email, webpage, or other marketing asset to see which one performs better. Essentially, it involves creating two variants of the same page (version A and version B) that are identical, bar one element that might affect user behaviour. 

This variation could be a headline, a call-to-action (CTA) button, a colour scheme, or any other element that might influence the outcome. Traffic is then divided between these two versions to determine which variation achieves a predetermined objective more effectively, such as higher conversion rates or increased engagement.
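The random 50/50 split described above can be sketched in a few lines of Python. This is a minimal illustration, assuming your audience is a simple list of subscriber identifiers; real email platforms handle this assignment for you.

```python
import random

def assign_variants(subscribers, seed=42):
    """Randomly assign each subscriber to variant A or B (50/50 split).

    `subscribers` is assumed to be a list of IDs or email addresses;
    a fixed seed keeps the assignment reproducible for this example.
    """
    rng = random.Random(seed)
    shuffled = subscribers[:]          # copy so the input list is untouched
    rng.shuffle(shuffled)              # randomise order to avoid bias
    midpoint = len(shuffled) // 2
    return {
        "A": shuffled[:midpoint],      # control group sees the original
        "B": shuffled[midpoint:],      # test group sees the variation
    }

# Hypothetical list of 1,000 subscribers
groups = assign_variants([f"user{i}@example.com" for i in range(1000)])
```

Shuffling before splitting matters: taking the first half of an unshuffled list could bias the test, for example if the list is ordered by sign-up date.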

Why is A/B testing important when sending email marketing campaigns?

The ultimate goal of A/B testing is knowing your audience: how they engage with, interact with, and react to your output. This understanding is what you can leverage to enhance your flows with data-driven insights. This is particularly important in automation, where the goal is to achieve high engagement rates with personalised content delivered at scale. 

A/B testing helps in optimising every aspect of email communication, ensuring higher open rates, click-through rates, and conversions, to ultimately maximise the return on investment.

When do you need to A/B test?

Common scenarios that call for A/B testing include:

  • Launching a new product or service (to understand customer preferences)

  • Experiencing lower-than-expected open rates, click rates, and conversion rates

  • Looking to increase engagement with under-performing campaigns

  • Growing your subscriber list (by A/B testing pop-up variations)

  • Making major changes to your website or strategy (to validate the direction of those changes)

Consistent A/B testing provides valuable insights into customer behaviour, informing future marketing and product development strategies.

How do you come up with a testing strategy that works?

Creating an effective A/B testing plan involves several steps. This is how we recommend breaking them down to form a strategy that works for you or your client. 

  • Identify clear objectives, i.e. what you want to achieve from your tests, whether it's increasing email open rates, placed order rates, or click rates.

  • Select one, and only one, variable to test at a time, such as the subject line of an email or the colour of a CTA button, to isolate its effect on the outcome. For instance, if you explore the colour of the CTA and also the length of your email and see a spike in form submit rates, you won’t know if the spike was due to the change in colour or the copy length. By testing one variable at a time, you can be sure you understand what resonates most with your audience.

  • Create two versions of your asset — the original version (control) and the modified version (test)—before segmenting your audience and making sure your test is being sent to a random, representative sample of your list to avoid bias. 

  • Test a sufficiently large audience. Consider this scenario: You create a form and test it using A/B testing. On the first day, the first 10 people who see the form click through and make a purchase. However, over the next week, the next 200 people exit out of the form immediately. If you stopped the test after the first day, you might believe that your form was successful. But by waiting until your results are statistically significant or until you have a sufficient sample size of viewers, one large enough to draw conclusions, you can accurately determine which form is better for your brand. 

  • Now, run the test, deploying both versions simultaneously to measure their performance over a longer period (unless the element you’re testing is sending time).  

  • Use statistical analysis to determine which version performed better and why, and apply these findings to optimise your digital assets. Continue testing other elements to improve your results.
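The statistical analysis step above can be as simple as a two-proportion z-test, which asks whether the difference between two conversion rates is large enough to be unlikely by chance. Here is a minimal sketch using only Python's standard library; the campaign numbers are hypothetical.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?

    conv_a / conv_b: number of conversions per variant
    n_a / n_b:       number of recipients per variant
    """
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)   # pooled rate under the null hypothesis
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical results: 5,000 recipients per variant,
# 400 conversions for A (8.0%) vs. 475 for B (9.5%)
z = two_proportion_z(conv_a=400, n_a=5000, conv_b=475, n_b=5000)
significant = abs(z) > 1.96   # roughly 95% confidence, two-tailed
```

If `significant` is false, the honest conclusion is "no detectable difference yet", not "A and B are equal": keep the test running or move on to a bolder variation.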

Testing Strategies to Improve Open Rates, Click-throughs, and Conversions

In A/B testing, virtually any element that influences user behaviour can be tested with the same goal of optimising KPIs. However, some key elements are particularly impactful:

Subject Lines

Testing different subject lines can help increase open rates by capturing the recipient's interest. Key strategies include keeping them concise, curiosity-invoking, personalised where possible, clearly offering value, and utilising urgency. Some elements to consider here are the use of emojis, character length, including preview text, format (statement vs. question-form), personalisation (from name, e.g., your brand name compared to a more personal from name, like “Mark at Brand XYZ”) and the special offer.

Example:

  • Test personalised subject lines, "[Name], your customised deals for May are here!📢 " against more generic ones: "Check out our deals for May inside!" 

  • Or test the impact of highlighting the discount "Just For You: Get 30% off on all items!" against a no-discount approach like "Save Big This Weekend!"

These tests can help you pinpoint what drives higher open rates for your audience. Klaviyo's A/B testing feature, powered by Klaviyo AI, lets you test various subject lines, content, and send times effortlessly. This helps you gain insights into what resonates most with your audience.

Image source: Klaviyo

Content and Layout

To enhance engagement and click-through rates, it's important to experiment with different content and layout strategies. By varying the length and style of your body copy—testing short versus long forms, storytelling versus direct approaches, and emotional versus rational appeals—you can identify which form of messaging resonates most with your audience. Exploring the design aspects of your emails, such as using image-heavy versus text-heavy layouts, experimenting with different colours and shapes for CTA buttons, and choosing between dynamic blocks versus integrated designs, will help you determine the optimal visual appeal. Testing variations like CTA hyperlinks versus buttons can also reveal the most effective methods for prompting user action. These targeted A/B tests aim to pinpoint the content and design elements that drive higher click rates and conversions, providing a clear path to improving campaign performance.

If you’re using Klaviyo, learn how to A/B test campaign emails here.

Copy Length

If you’re announcing a new product or testing your Welcome Series, for example, it can be hard to tell whether you’re better off with long-form copy full of product details and brand information, or short-form copy with a great hero and standout CTA. The answer depends on your audience, the complexity of your product, and your email design, which makes copy length a prime candidate for testing.

Here’s a guide on how to A/B test a flow email in Klaviyo.

CTA Buttons

Altering the text, colour, size, or placement of CTA buttons (e.g., near the top of your email or lower down) affects conversion rates by making the desired action more compelling. Need help perfecting your CTAs? Check out our previous blog here.

Images

You can test different images vs. no images at all to see how your audience responds and whether they take action. You can test different hero images to see which is most effective, GIFs vs. static graphics, videos vs. infographics, etc. By systematically testing each variant against another, you'll gradually gain insights into what yields the most favourable results, allowing you to refine your approach one variable at a time. 

Send Times

Varying the send time of emails can help identify when recipients are most likely to open and engage with content, maximising impact. You can test morning against evening send times, weekends vs. weekdays, or sending 30 minutes vs. 1 hour after a recipient has carried out a desired action (for automated flows like browse abandonment and post-purchase), and observe how open and click rates are impacted. 

Personalisation

Personalisation is the key to taking your email campaigns to the next level in 2024, and A/B testing helps identify the most effective personalisation strategies. Make your customers feel important and seen, and engagement should soar. You can test whether including your recipients’ first names in the subject line or email copy has an effect on open and conversion rates, or whether personalising with previous purchase information increases engagement. There is also the "From [Name]" tactic to introduce a sender and add an extra dimension of personality to your emails.

To take things one step further, consider making use of Klaviyo’s Show/Hide template blocks feature to display (or conceal) region-specific content around local holidays or promotions, like special Mother's Day/Father’s Day offers for customers celebrating those holidays. If you observe a decline in engagement metrics, you can also experiment with different email sign-offs, such as having messages come directly from your company's founder, especially when conveying thanks or exclusive offers. These variations allow you to observe what works best with your audience and refine your personalisation strategy accordingly.

Klaviyo's Show/Hide Logic feature displays specific blocks or sections to selected email recipients, enabling tailored content based on subscriber information for a highly personalised marketing experience.

Offers

Different types of promotional offers resonate with different audiences. You can A/B test whether a percentage discount (e.g., 20% off) drives more conversions compared to a monetary discount (e.g., £10 off orders over £50), or even free shipping. You can also experiment with gift-with-purchase promotions versus bundle deals to determine which incentivises higher average order values. These tests help pinpoint the most compelling offers in your subscribers’ eyes. 


Final Thoughts

Data-driven decisions make the difference between good and great marketing, and A/B testing is a cornerstone in the pursuit of growth and efficiency in email marketing. It lets you show variations to a subset of your audience to see what drives better engagement and what makes them more likely to take your desired action. However, navigating the complexities of it, from planning and executing to analysing and implementing changes, is easier said than done.

This is where we come in. At Melusine Studio, we specialise in helping businesses like yours fine-tune their optimisation strategies with advanced A/B testing. If you're ready to supercharge your campaigns, or if you have any questions about how to get started, book a call with our team today.
