7 Misconceptions About Email A/B Testing

Published: August 12, 2024

Few email marketing initiatives are as important — or as misunderstood — as A/B testing. Over the years, we’ve guided brands through A/B testing initiatives that drive growth and insights, educating and empowering our clients along the way. In that spirit, let’s identify and correct the most persistent misconceptions about A/B testing in email campaigns.

Misconception 1: It’s All About The Subject Line

For lots of marketers, “A/B email testing” starts and stops at the subject line. The truth is that there are several variables you can and should test, such as:

  • Open rates at different times of the day and week, along with varying subject lines; and
  • Post-open engagement and conversion factors, such as body copy, headline copy, offer/promo, CTA, background colors or the order of content blocks.

You should make a point of updating your list periodically with other elements to test. The more you test, the more you learn.

One last note on this subject: Changes in open rate are relatively murky post-iOS 15, which is even more reason to expand the scope of variables to test.


Misconception 2: You Can Test Everything, Everywhere, All At Once

Clean A/B tests involve one variable, not many: Don’t switch up the subject line, send time, and day of week all at once, or you won’t know which change made the difference. Beyond that, they require focus on specific elements within that variable.

Since people love to talk about subject lines, let’s use that as an example. Wildly different subject lines will perform differently, but you won’t understand why. On the other hand, if you’re testing specific differences like emojis versus no emojis, question versus statement, etc., you’ll have a clear takeaway from the data.
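In practice, a clean test also means the audience split itself is random, so the only difference between the two cells is the one element under test. Here’s a minimal sketch of such a split — the subscriber list and seed are hypothetical, not a DMi tool:

```python
import random

def split_audience(subscribers, seed=42):
    """Randomly split a subscriber list into two equal test cells.

    A fixed seed makes the split reproducible. Everything except the
    single variable under test (e.g. emoji vs. no emoji in the
    subject line) should be identical between the two cells.
    """
    shuffled = subscribers[:]
    random.Random(seed).shuffle(shuffled)
    midpoint = len(shuffled) // 2
    return shuffled[:midpoint], shuffled[midpoint:]

# Hypothetical list of subscriber emails
audience = [f"user{i}@example.com" for i in range(1000)]
cell_a, cell_b = split_audience(audience)  # 500 recipients each
```

Randomizing (rather than, say, splitting alphabetically or by signup date) keeps hidden differences in the audience from masquerading as a winning variant.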

Misconception 3: Instant Wins Abound

One of the biggest misconceptions is that each A/B test might just produce a silver bullet. In fact, A/B testing results are best seen in aggregate, over the long term. If you’re structuring your tests well and getting information to optimize future sends, the wins will mount steadily over time.

One reason you shouldn’t expect instant results is that the audience for the test will be split in half: Even if one version knocks the other out of the park, 50% of your audience will still be underperforming in comparison.

Last thing here: Depending on the test, you might not find any kind of winner. Inconclusive findings are fairly common, and it doesn’t mean the test was flawed. Instead, it simply means the variable you introduced — e.g., changing the copy format to bullet points — didn’t move the needle one way or the other.
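One common way to tell a genuine winner from an inconclusive result is a two-proportion z-test on each cell’s engagement counts. This is a generic statistical sketch, not a DMi-specific method, and the open counts below are made up for illustration:

```python
import math

def two_proportion_z(opens_a, sends_a, opens_b, sends_b):
    """Two-proportion z-test: is the difference in open rates
    between cell A and cell B bigger than chance would explain?"""
    p_a = opens_a / sends_a
    p_b = opens_b / sends_b
    pooled = (opens_a + opens_b) / (sends_a + sends_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
    return (p_a - p_b) / se

# Hypothetical results: 110/500 opens for A vs. 98/500 opens for B
z = two_proportion_z(110, 500, 98, 500)

# |z| > 1.96 would indicate significance at the 95% level; anything
# less means the test is inconclusive rather than flawed.
significant = abs(z) > 1.96
```

In this made-up example, a 22% open rate versus 19.6% looks like a win at first glance, but the z-score falls well short of the significance threshold — exactly the kind of “no winner” outcome described above.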

Misconception 4: Once A Winner, Always A Winner

A winning test from six months ago might not be a winner if the same test were run today. That’s not to say you should relaunch tests in rigid six-month intervals, but you should keep an eye on any winning campaigns to make sure their performance remains strong. If you see engagement trending steadily down over time (rather than witnessing one bad send and over-reacting), it might be your signal to take another look.

Misconception 5: The Truth Is Universal

We love a good best practice at DMi, but we also know there are exceptions to every rule that you won’t know until you test. For instance, for one client, we personalized a CTA to reflect the person’s location (e.g. “Find businesses in {city}!”), and performance was far lower than the email with the non-personalized CTA.

In other words, work with guidelines — not assumptions — and be prepared for some surprises along the way.

Misconception 6: A/B Tests Are Always A Good Idea

Speaking of universal truths, while we love A/B tests, they aren’t always a good strategic move! If you need to push out an email for the sake of engagement and performance, put your best foot forward (informed by all the learnings from past A/B tests, of course) rather than segmenting your audience and risking a less-than-optimal performance from 50% of the list.

Misconception 7: There’s No Value In Failure

If you test a great idea that falls flat, it can be hard on the ego — but it’s also important knowledge that you need to file away for reference. Remember that the aggregate value of A/B testing comes in the insights you can apply over time, and you’ll reframe “failure” as a valuable piece of information.

How important is it to get A/B email testing right? Well, as media costs soar and first-party data usage becomes even more important, email marketing has a very important role in holistic marketing strategies. Great A/B testing that helps you raise the levels of your email engagement over the long term is one of the best investments you can make for your marketing growth.


Lauren McGrath is an Associate Email Strategy & Execution Manager at digital marketing agency DMi Partners, which she joined in 2021. Lauren, a New Jersey native who earned a bachelor’s degree in fashion merchandising and an MBA from Thomas Jefferson University, builds and optimizes email campaign execution processes for a portfolio of brands across B2C, CPG and B2B verticals. She began her email marketing career in 2018 and is passionate about raising the bar for email performance standards across the marketing industry.

Posted in: Demanding Views

Tagged with: DMi Partners
