The Highest-Impact Email A/B Tests We’ve Ever Run


I’m the luckiest kind of marketer – one for whom A/B tests reach significance quickly.

We have enough traffic to run tests and quickly see whether our changes improved things or sent us running to the developers at 2am.

It’s the same for our email campaigns. The Wishpond blog sends its newsletter to 171,000 recipients, which means the email optimization tests I run reach significance almost immediately.

Not every marketer is so lucky, so I want to share some of my luck around.

This article breaks down the most impactful subject line A/B tests we’ve run in the past few months – tests you can copy for your own business’s newsletters and email campaigns.

Since August, what we’ve learned from these tests has improved our average open rates by 40%.

Newsletter A/B Test #1: Question vs Statement

The experts are divided on this subject.

For instance, Dan Zarrella’s analysis states clearly that “People are less likely to open emails that include a question mark (?) or a hashtag (#).”

Constant Contact, however, states in no uncertain terms that “Using a question in your subject line is a great way to make a more personal connection with the people viewing your emails.”

So… let’s check it out.

[Image: the two subject line variations]

Test “A” converted 26% better than Test “B” with 100% certainty.
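If your list is smaller and you want to check whether a difference like this is real before acting on it, the standard approach is a two-proportion z-test on the open rates. Here is a minimal sketch in Python using only the standard library; the send and open counts are illustrative, not our actual campaign data:

```python
from math import erf, sqrt

def open_rate_significance(opens_a, sends_a, opens_b, sends_b):
    """Two-proportion z-test: is the difference in open rates real?"""
    rate_a = opens_a / sends_a
    rate_b = opens_b / sends_b
    # Pooled open rate under the null hypothesis (no real difference)
    pooled = (opens_a + opens_b) / (sends_a + sends_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
    z = (rate_a - rate_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Illustrative numbers only: 10,000 sends per variation,
# 2,520 opens for "A" vs 2,000 opens for "B" (a 26% lift)
z, p = open_rate_significance(2520, 10000, 2000, 10000)
print(f"z = {z:.2f}, significant at 95%: {p < 0.05}")
```

A p-value below 0.05 is the usual bar for calling a test significant; with lists the size of ours, even modest lifts clear it quickly.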

Actionable Takeaway from this Email A/B Test:

Perhaps the question is too commonly asked. Perhaps the strategies and examples are too desirable. But, for us, questions don’t result in higher newsletter open rates.

Your newsletter subscribers want to know what they’re going to get when they click on an email. Don’t beat around the bush with vague questions.

Want to see the impact of testing on your next newsletter?

Wishpond’s email marketing tool makes it super easy to set up A/B tests, as well as automate your campaigns. Click here to learn more and talk to an expert.

Newsletter A/B Test #2: “Analyzed” vs “You can Copy”

This test sought to determine if our recipients responded better to the idea of marketing strategies/examples/best practices being “analyzed” or if they were more interested in the ability to copy those strategies/examples/best practices.

Here are the newsletter variations:

[Image: the two subject line variations]

Test “B” converted 69% better than Test “A” with 100% significance.

Actionable Takeaway from this Newsletter A/B Test:

At the heart of it, this newsletter A/B test asks the question, “Does the modern marketer want to read about data being analyzed, or do they just want something they can take to the bank, today?”

This test, like the best email A/B tests, gives us something to take forward.

People want actionable strategies – strategies they can pull up on one monitor and put into action on the other.

Because of this test, we’ve started to move away from thought leadership, and toward actionable, copyable marketing tactics in our content and in our email strategy.

Newsletter A/B Test #3: Inserting company name

This test was only run to about 15% of our newsletter list, as we don’t have a company name for every one of our subscribers. But because the difference between variations was so large, we still reached 100% significance.

The test was designed to determine if the addition of a business’ company name in the email’s subject line would improve open rates.

Here are the two newsletter variations:

[Image: the two subject line variations]

Test “B” converted 16% better than Test “A” with 100% significance.

Actionable Takeaway from this Newsletter A/B Test:

This one doesn’t surprise me in the least. Every case study makes it clear that personalization increases email open rates.

The problem, of course, is that you don’t always have the personal details which make personalization possible.

So the actionable takeaway from this newsletter A/B test is to get those personal details. Ask your leads for whatever information will increase your email’s open rates and final conversions, then put that information to work.

Newsletter A/B Test #4: Relevance vs Specificity

This test was all about proving to myself that it was better for us to be relevant (using “Black Friday”) than it was for us to deliver number-specific value.

This was one of our most impactful newsletter A/B tests, and the results were loud and clear.

Here are the two newsletter variations:

[Image: the two subject line variations]

Test “A” converted 64% better than Test “B” with 100% certainty.
