A Practical Guide for Email A/B Testing

Developing a routine testing program is critical to optimizing the effectiveness of your email marketing.

Our audience of educators is wary of unsolicited email. Like most professionals, their inboxes fill daily with appeals of all sorts. The more tech-savvy the educator, the more likely they are to be using filters or other email management tools to control the flow of inbound communication.

A systematic A/B testing program can increase response to your emails.

A/B testing has been with us since the early days of direct mail, and the practice applies directly to email, where it is both easier and less costly to run. Basically, you divide your audience into two randomized segments and test one variable (and only one variable) at a time to determine which strategy generates the greater response.
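The randomized split described above can be sketched in a few lines. This is a minimal illustration, not a prescribed tool: the `ab_split` function name, the fixed seed, and the sample addresses are all hypothetical.

```python
import random

def ab_split(recipients, seed=42):
    """Randomly divide a recipient list into two equal-sized segments.

    Segment A receives one version of the email, segment B the other;
    only one variable should differ between the two versions.
    """
    shuffled = recipients[:]              # copy so the original list is untouched
    random.Random(seed).shuffle(shuffled) # fixed seed makes the split reproducible
    midpoint = len(shuffled) // 2
    return shuffled[:midpoint], shuffled[midpoint:]

# Hypothetical recipient list of 100 addresses
recipients = [f"user{i}@example.edu" for i in range(100)]
group_a, group_b = ab_split(recipients)
```

Shuffling before splitting matters: slicing an unshuffled list could put, say, all the earliest sign-ups in one segment and bias the result.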

You can use A/B testing on every variable in your email such as:

  • The audience – job title, geography and other demographics
  • HTML vs. text
  • Deployment timing – day of the week and time of day
  • The offer
  • Subject lines
  • Headlines
  • Length of content
  • Personalization
  • Design, including number of images, buttons vs. links, etc.
  • Call to action

Although you should only test one variable at a time, a variable such as personalization can be tested in several ways, such as in the subject line or the salutation. In the Science of Email Marketing Report 2014, HubSpot marketers note that using a first name in email increases click-through rates from 5.8% to 7%. The report writers conclude that with personalization, it appears to the recipient that you are “trying to solve your reader’s problems rather than just your own.”

It might also be worthwhile to test the recipient’s first name in the subject line to determine if this changes response either positively or negatively.

Every email test result should bring you closer to understanding your audience. The email marketing report mentioned above also found that while the volume of emails is lower on the weekends, the open rate is significantly higher. This may or may not be true for your audience, but it is worth testing.

Design elements are also important to test. Do emails with images get opened more frequently? Do they generate more click-throughs? Which tests better, a button or a link? What about the size, color, and type of headlines inside the email?

What about long subject lines vs. short ones? When most people are scanning their email, which pulls better for you – shorter or longer text? What about the tone of the text – more personal or professional?

It’s likely that your audience will respond at a higher rate to short, clear, direct calls to action. They want to know what you expect them to do. Be very clear in asking them to take one action. But you should also test the language of your call to action to see what works best for your offers.

You don’t need a complex database program to manage an A/B testing program. A simple spreadsheet will do. But testing should be a part of every campaign. If you can schedule some testing prior to the launch of a major campaign, you improve your chances of increasing response.
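When comparing the two segments' results from your spreadsheet, it helps to check whether the difference is large enough to trust. One standard approach, shown here as a sketch rather than a required tool, is a two-proportion z-test; the `open_rate_significance` name and the example counts are hypothetical.

```python
from math import sqrt, erf

def open_rate_significance(opens_a, sent_a, opens_b, sent_b):
    """Two-proportion z-test: is the difference in open rates between
    segments A and B likely real, or just noise?"""
    p_a, p_b = opens_a / sent_a, opens_b / sent_b
    p_pool = (opens_a + opens_b) / (sent_a + sent_b)   # pooled open rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
    z = (p_a - p_b) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical campaign: 140 of 1,000 opens for version A, 100 of 1,000 for B
z, p = open_rate_significance(140, 1000, 100, 1000)
```

A p-value below 0.05 is a common (if rough) threshold for treating the winning version as a real winner rather than chance.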

The benefits of routine A/B testing will contribute to the effectiveness of your email campaigns. Every time you email your audience, you should learn something new about them.
