How an Effective A/B Testing Strategy Improved Conversion Rates by 37%


Wishpond is happy to publish this guest post from Jacob Baadsgaard.

No matter how good you are at marketing, there’s always room for improvement. This is especially true when it comes to landing pages.

There’s no set formula for creating a high-converting landing page. To get a great conversion rate, you need to answer a lot of questions.

Does your hero shot sit right with your audience? Is your call-to-action eye-catching enough? Is your copy too long…or too short? Do you simply have the wrong traffic?

The possible explanations for a lousy conversion rate are endless.

However, with an effective A/B testing strategy, figuring out what works for your audience and using that to increase your conversion rates and sales can be a fairly straightforward process.


What a Great Testing Strategy Looks Like


To show how a great testing strategy produces results, let’s take a look at a recent test we ran at Disruptive that increased a client’s conversion rate by 37%.

With one test.

At first glance, this seems like exactly the opposite of what I just said about needing an awesome testing strategy to produce awesome results. However, we didn’t pull this 37% improvement out of a hat.

Instead, it was the culmination of a series of tests that each provided new insight into our target audience.

Prior to this successful test, we had run numerous unsuccessful or partially successful tests on the page. Those tests had generated minor improvements in conversion rate, but we still hadn’t produced the kind of results we wanted.

As disappointing as our tests had been in ROI terms, they hadn’t been in vain. With every test, we had learned more about our target audience.

We knew we were marketing to millennials who wanted a seamless online experience. They were internet-savvy, but also a bit skeptical of overly pushy sales tactics.

Most importantly, they wanted to be in control of their online experience.

Considering what we knew about our audience, we decided to try a “push-pull” strategy. So, instead of making the form the first thing they saw on the page, we switched to using a lightbox popup form that only appeared when the user clicked a CTA button.

As a result, our users had to click the button to get at the form and—when they did—they were rewarded with our lightbox popup.
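In code terms, the pattern is simple. The sketch below is an illustration of the idea rather than the client’s actual implementation; the element ids (“cta-button”, “lightbox-overlay”) and the CSS class name are assumptions.

```typescript
// A minimal sketch of the "push-pull" pattern: the form lives in a hidden
// lightbox overlay and is only revealed when the visitor clicks the CTA.
const cta = document.getElementById("cta-button");
const overlay = document.getElementById("lightbox-overlay");

// Pull: the visitor asks for the form by clicking the CTA button.
cta?.addEventListener("click", () => {
  overlay?.classList.add("is-visible");
});

// Keep the visitor in control: clicking outside the form dismisses it.
overlay?.addEventListener("click", (event) => {
  if (event.target === overlay) {
    overlay.classList.remove("is-visible");
  }
});
```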


It was a simple idea, but it dramatically improved the conversion rate of our page.

Now, instead of trying to convince our audience to fill out our form, we were letting our audience sell themselves and ask for the form—which was exactly what they wanted.

With this new strategy in play, our conversion rate jumped from 17.65% to 28.13%—producing 100 additional sales and $43,017 in additional revenue in just 3 months!

All of those extra sales came at no added marketing cost, so this one test increased the client’s return-on-ad-spend by 88%. Our testing strategy had finally paid off.


Audiences Are Unique


At first, we thought we had found the holy grail of conversion rate optimization. We tried lightbox popups with a variety of clients in the hopes of seeing across-the-board improvements.

Unfortunately, it didn’t work out that way.

One client’s conversion rate improved by 15%. Two other clients’ conversion rates stayed about the same. Another client’s conversion rate dropped by 18%!

What happened?

Unfortunately, in our excitement over our testing win, we had overlooked what made our lightbox test successful in the first place—lightboxes were a great fit for our millennial audience.

Apparently, lightbox popups weren’t necessarily a great fit for other audiences.

This sort of over-generalization is a common problem for marketers. In the search for the next marketing silver bullet, it can be easy to read an A/B testing case study and assume that you can use data from someone else’s tests to shortcut your way to conversion rate nirvana.

The problem is, while case studies are a great way to come up with testing ideas, you can’t assume that what worked for someone else’s audience will be a perfect fit for your market.

How’s the old saying go? When you assume, you make an “ass” out of “u” and “me.”

It’s as true for A/B testing as it is for anything else.

The key to optimizing your landing page conversion rate isn’t trying someone else’s ideas—it’s getting to know your market and creating pages that work for your unique audience.


How to Set Up a Great Testing Strategy


The secret to successful landing page testing is creating a great testing strategy—one that teaches you something with every test.

Creating and using a great testing strategy takes planning and documentation, but it will save you a lot of time and allow you to increase your profitability in the long run.


Create Your Buyer Persona


To begin, you need to profile your buyer personas. At a minimum, you should know the following:

  • Who is your target audience?
  • How old are they?
  • Are they primarily of a specific gender?
  • What is their role?
  • What are their responsibilities?
  • What is their budget for your product or services?
  • What motivates them?
  • What is their pain point?
  • How does your product or offer resolve their problem?

To answer these questions, you may need to talk to your sales team or even interview some of your current customers. It will take some extra effort up front, but it can cut a lot of time off your testing learning curve. One way to record the answers is sketched below.
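As a rough illustration (not any particular tool’s format), you might capture each persona as structured data so that every later hypothesis can point back at the trait it targets. The field names and example values below are hypothetical, loosely modeled on the millennial audience described earlier.

```typescript
// A hypothetical sketch of a buyer persona as structured data.
// Field names and values are invented for illustration.
interface BuyerPersona {
  segment: string;            // who the target audience is
  ageRange: [number, number]; // how old they are
  role: string;               // their role
  responsibilities: string[]; // what they are responsible for
  budgetUsd: number;          // budget for your product or service
  motivations: string[];      // what motivates them
  painPoints: string[];       // the problems they need solved
  howOfferHelps: string;      // how your offer resolves those problems
}

const millennialBuyer: BuyerPersona = {
  segment: "Internet-savvy millennials, skeptical of pushy sales tactics",
  ageRange: [25, 38],
  role: "Individual consumer researching online",
  responsibilities: ["Comparing options before committing"],
  budgetUsd: 500,
  motivations: ["Staying in control of their online experience"],
  painPoints: ["Forms and popups pushed at them before they ask"],
  howOfferHelps: "A seamless, self-directed signup flow",
};
```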


Create Your Hypotheses


Once you’ve profiled your buyer personas, use those profiles to build your first landing page variant. If your messaging and targeting aren’t on point for your audience, you won’t learn much from your tests, so make sure that your landing page is getting the right traffic!

With your first variant in hand, it’s time to use your buyer persona to come up with a hypothesis about how your target audience might interact with your page. Does your page fit all their needs? Are there additional ways that you could address the same needs?

Once you have a few hypotheses, build your testing strategy around those ideas.

For example, there might be a few different ways to write or present your CTA. Your first attempt might be a great starting point, but there’s probably a better way to present your CTA that you can identify with a few A/B tests.
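Before acting on any one result, it also helps to check that the difference between variants is bigger than random noise. One common approach (not described in the original test, so treat this as an illustration) is a two-proportion z-test; here is a minimal sketch, with invented visit and conversion counts whose rates echo the case study above.

```typescript
// Two-proportion z-test: is variant B's conversion rate genuinely
// different from variant A's, or is the gap plausibly just noise?
function zTest(convA: number, visitsA: number, convB: number, visitsB: number): number {
  const pA = convA / visitsA;
  const pB = convB / visitsB;
  // Pooled rate under the null hypothesis that A and B convert equally.
  const pooled = (convA + convB) / (visitsA + visitsB);
  const stdError = Math.sqrt(pooled * (1 - pooled) * (1 / visitsA + 1 / visitsB));
  return (pB - pA) / stdError;
}

// Hypothetical counts: 141/800 (~17.6%) vs. 225/800 (~28.1%).
const z = zTest(141, 800, 225, 800);
// |z| > 1.96 corresponds to roughly 95% confidence for a two-sided test.
console.log(z.toFixed(2), Math.abs(z) > 1.96 ? "significant" : "keep testing");
```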

To see how changing your CTA affects your conversion rate, you might put together a spreadsheet like this:
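(The rows below are a purely hypothetical example; every hypothesis and result is invented for illustration.)

Test | Hypothesis                                    | Result          | Next test
1    | “Get My Free Quote” will beat “Submit”        | +6% conversions | Test first-person copy
2    | First-person copy will add further lift       | No change       | Test button contrast
3    | A high-contrast button will draw more clicks  | +9% conversions | Test CTA placement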

See how each test sets up the next test? You learn something from each iteration and then use that to guide your next test.

Plus, everything is thoroughly documented, so if anyone ever wonders why you made a certain choice, you’ve always got a handy reference!

A good landing page and A/B testing tool like Wishpond will document your results, which is helpful, but if you don’t document the thinking behind the test, the results won’t do you much good.


Conclusion


Successful landing page A/B tests are rarely the result of a lucky break. If you want great conversion rates from your page, you need a great testing strategy that equips you with a deep understanding of your target audience.

Your conversion rate might not immediately improve by 37%, but with the right testing strategy in place, you’ll get there eventually.

You’ve heard my two cents; now I want to hear yours.

How have you seen an effective testing strategy improve conversion rates? Have you tried a lightbox popup before? How did it affect your page performance? How do you use case study data to guide your landing page tests?


About the author:

Jacob is a passionate entrepreneur on a mission to help businesses achieve online marketing success. As the Founder & CEO of Disruptive Advertising, Jacob has created an award-winning, world-class organization that has helped hundreds of businesses grow using pay-per-click advertising and conversion rate optimization.

