A/B Split Testing for Newsletter
Maximize the success of your email campaigns
With A/B Split Testing you can compare different versions of your newsletter campaigns to optimize your email marketing: Recipients of your email campaign are randomly divided into two sample groups. Each group receives a distinct version of your newsletter.
- Your two test versions only differ in one detail (e.g. the subject line) and are identical in all other aspects.
- Test group A receives version A; test group B receives version B.
- In the end, the recipients’ behavior (opens, clicks) is measured and evaluated – the more successful version wins and can be sent to the remaining recipients.
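The core mechanism – randomly dividing recipients into two equal test groups – can be sketched in a few lines of Python (the recipient list here is made up for illustration):

```python
import random

def split_sample(recipients, sample_size):
    """Randomly pick a sample and divide it into two equal test groups."""
    sample = random.sample(recipients, sample_size)  # sampling without replacement
    half = sample_size // 2
    return sample[:half], sample[half:]  # group A, group B

recipients = [f"user{i}@example.com" for i in range(1000)]
group_a, group_b = split_sample(recipients, 200)
```

Because `random.sample` draws without replacement, no recipient can land in both groups – a requirement for a clean test.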
1. Variables you can test
A/B Split Testing is an experiment that lets you test, analyze and optimize your newsletter campaigns. Instead of guessing, run an A/B test to find out what works with your recipients.
There are lots of parameters that you can test:
- Subject line: This is the most important detail that decides whether your newsletter gets opened or deleted. Try long and short ones, funny and serious, personalized and generalized and so much more!
- Pre-header text: Many email clients, especially on mobile, display the pre-header, so it is just as important as the subject line for high open rates. Try two different versions!
- Sender’s name: Company name, staff member or a mixture of both? Only first name or first and last name? There are no limits to the possibilities, but the right name can hugely influence your open rates!
- Send times: Test different delivery times and days
- Layout: Page design, structure, and the color and shape of your (call-to-action) buttons
- Visual elements: Images, formats, fonts and background colors
- Addressing: Personalized versus generalized salutation
- Content: Headlines, order of topics, length, phrasing, descriptions and calls to action
- Offers: Free shipping, coupons and discounts are strong click incentives
Tip: Only test one variable per A/B Split test – otherwise you won’t get clear and unambiguous results. If you want to test two different subject lines and the placement of your call-to-action buttons, for example, you’d have to run two A/B split tests.
2. Set test criteria
Select the test criteria according to your goals and the content of your mailing:
- Size of sample group
- Send times
- Duration of the test
- Whether the winning version should be delivered automatically at the end of the test
Or set goals like:
- Increase open rates
- Higher click rates
- URL (more unique clicks on a URL)
- Conversion rate (this is only possible in connection with Conversion Tracking)
3. Set up A/B Test
- Select the size of the sample groups, for example 10 percent of the list that will receive the campaign.
- Now create the two versions of your newsletter, depending on the criteria you want to test (e.g. subject lines, images or calls to action).
- Your sample groups are selected randomly, and recipients don’t know that they are part of a test group. With 20,000 recipients and sample groups of 10 percent each, for example, 2,000 recipients receive version A and 2,000 receive version B of the mailing.
- After the test duration (which you set before the test starts), the winning version is determined. If, for example, you want to identify the more successful subject line, the group with more unique opens decides the winner.
- The winning version is then sent automatically to the rest of your recipients (if you have selected this option).
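The steps above can be put together as a small Python sketch. The `send` and `unique_opens` callbacks are hypothetical stand-ins for your email tool’s API, and the winner is chosen by unique opens as in the subject-line example:

```python
import random

def run_ab_test(recipients, version_a, version_b, sample_pct, send, unique_opens):
    """Sample recipients, send both versions, pick the winner by unique opens,
    then send the winning version to everyone else."""
    group_size = int(len(recipients) * sample_pct)  # size of EACH test group
    sample = random.sample(recipients, 2 * group_size)
    group_a, group_b = sample[:group_size], sample[group_size:]

    send(group_a, version_a)  # version A goes to test group A
    send(group_b, version_b)  # version B goes to test group B

    # After the test duration: the group with more unique opens wins.
    winner = version_a if unique_opens(group_a) >= unique_opens(group_b) else version_b

    sampled = set(sample)
    send([r for r in recipients if r not in sampled], winner)
    return winner
```

With 20,000 recipients and `sample_pct=0.10`, each test group gets 2,000 recipients and the remaining 16,000 receive the winning version.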
4. Which test is the right one?
To get useful results for an A/B test, you have to set clear goals first. What figures do you want to optimize? Which factors influence which figure in your newsletter?
- Improve open rates: Test the features that decide whether your newsletter gets opened: subject line, sender’s name, delivery time (day and time)
- Increase click rates: Your newsletter has been opened, but what influences your click rate? Find out by varying these elements: images, text, color (style, amount, order), call-to-action buttons (placement, size, shape, color)
- More turnover through higher conversion: Which incentives generate higher turnover? Try discounts and coupons, or offer free shipping and thank-you gifts
Tips for successful newsletter testing:
- Goal: Formulate clear objectives. Do you want to increase open rates or conversions? Or are you mainly interested in getting more people to click your links? Define from the start which figures you want to improve
- Test parameters: Choose the right parameter for testing. Depending on your objective, several variables may influence it; vary exactly one of them between the two versions
- Repetition: Run A/B tests regularly. What works perfectly well today can be a total disaster next year
- Send simultaneously: The delivery time can distort the test results, so send both newsletter versions at the same time – unless the delivery time is the parameter being tested
- Sample group: Choose your sample group wisely. The CleverReach® Newsletter Testing Tool helps you make sure your groups are selected randomly and are large enough to generate a representative result
- Figures: Don’t forget the other relevant email-marketing figures! Besides opens, clicks and conversion rate, bounces and unsubscribes also play a role and can be examined and reduced with an A/B Split Test
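Whether an observed difference between two groups is more than chance can be checked with a standard two-proportion z-test. This sketch uses only the Python standard library, and the open counts are example numbers, not real data:

```python
import math

def two_proportion_z(opens_a, n_a, opens_b, n_b):
    """Z-score for the difference between two groups' open rates."""
    p_a, p_b = opens_a / n_a, opens_b / n_b
    p_pool = (opens_a + opens_b) / (n_a + n_b)  # pooled open rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Example: 520 vs. 460 unique opens out of 2,000 recipients per group.
z = two_proportion_z(520, 2000, 460, 2000)
significant = abs(z) > 1.96  # roughly the 95% confidence threshold
```

If `abs(z)` stays below the threshold, the groups were likely too small (or the difference too subtle) to call a winner reliably.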
A/B Split Testing with CleverReach®
- Up to 2,500 recipients
- Up to 10,000 emails per month – for free!
Our Lite plan has no minimum term, no setup fee and no contractual obligations.
For more emails and features, choose between our Flex, Essential and Enterprise plans. Our price plan calculator will tell you which plan is best for you!