How to A/B test subject lines

How to A/B test your Broadcast subject lines, the reports provided, and answers to frequently asked questions.

Knowing your subscribers is a key ingredient to effective email marketing.

A/B testing is a great way of gaining this knowledge! In ConvertKit, you can test different subject lines for your Broadcasts, and we'll automatically determine the winner based on their respective open rates.

How does A/B testing work in ConvertKit?

A/B testing is optional and allows you to test two Broadcast subject lines with subsets of your recipients. We'll then automatically send the winning variant to the remaining recipients.

The process

  • We'll send each subject line variant to a separate 15% of your recipients (so 30% of your total recipients combined)

  • After a four-hour testing period, we'll determine the winning subject line based on which one resulted in the higher open rate

  • We'll then automatically send the winning variant to the remaining 70% of your recipients (i.e., those not involved in the initial test)

☝️ NOTE: Make sure you take into consideration the four-hour test period before running an A/B test! It might not make sense to run A/B tests on time-sensitive Broadcasts where you can't afford to wait four hours for all your emails to be sent.
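The split described above is simple arithmetic. Here's a rough sketch of it in Python (the function name and return shape are illustrative only, not part of ConvertKit):

```python
def ab_test_split(total_recipients: int) -> dict:
    """Sketch of the A/B split: 15% per subject line variant,
    with the winner sent to the remaining ~70%."""
    variant_a = round(total_recipients * 0.15)  # first test group
    variant_b = round(total_recipients * 0.15)  # second test group
    winner_batch = total_recipients - variant_a - variant_b  # everyone not in the test
    return {"variant_a": variant_a, "variant_b": variant_b, "winner_batch": winner_batch}
```

For a 10,000-subscriber list, this works out to 1,500 recipients per variant and 7,000 who receive the winning subject line after the four-hour test.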

How to set up an A/B test

To set up an A/B test for a Broadcast, click the A/B symbol next to the subject line in the Broadcast editor:

This will allow you to input your two subject line variants for this email, which will be tested against one another to see which has a higher open rate:

Adding two subject line variants is all you have to do to set up an A/B test! You can then send the Broadcast as normal and the test will happen automatically—no further action is required on your part (except checking back to learn the winner, if you're curious!).

Who should (and shouldn't) A/B test?

We recommend A/B testing only emails going out to 1,000 recipients or more. A test run on a smaller number will not give actionable data as the results will not be statistically significant.

For example: if you send your Broadcast to 100 recipients, each subject line variant would go out to only 15 people (15% of recipients) for the initial test. Even with a high open rate of ~50%, that means only seven or eight subscribers are likely to open each variant.

The result? That "winning" subject line could be determined by the whims of a single person. One individual's decision to open an email (or not) will not provide meaningful insights for applying to your greater list.
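One way to see how noisy a small test is: compute how far a single subscriber's open (or non-open) moves a variant's measured open rate. This hypothetical helper is for illustration only:

```python
def open_rate_swing(total_recipients: int, test_fraction: float = 0.15) -> float:
    """Percentage points by which one subscriber's decision shifts
    a single variant's observed open rate during the test."""
    sample_size = max(1, round(total_recipients * test_fraction))
    return 100.0 / sample_size
```

With 100 recipients, each variant's open rate swings about 6.7 percentage points per person, easily enough to flip the "winner." With 10,000 recipients, the swing is only about 0.07 points, so the comparison is far more stable.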

A/B test reporting

If you run an A/B test for a Broadcast, its report's Overview page will show the aggregate stats for all Broadcast recipients (including those who received the test variants):

NOTE: Aggregate stats won't be available:

  • During the four-hour testing period. That's because the winner hasn't been sent to the rest of your recipients list yet.

  • If you cancel the A/B test before your email has finished sending. In this case, only the A/B test results for each variant will be available.

And you can see the A/B test results for each variant on the Analytics page:


Where in ConvertKit is A/B testing available?

A/B testing is available only for Broadcast subject lines at this time.

Can I manually end an A/B test early?

Yes! To do so, go to the Broadcast's Analytics report.

Click the three vertical dots on the right, followed by Cancel A/B test.

What happens if I cancel my A/B test?

If you cancel your A/B test before it completes sending, your Broadcast will have been sent to only your test recipients (i.e., 30% of the recipient list).

To send the Broadcast to the remaining 70% of the recipient list, first open the Broadcast's Recipients page to view the 30% of subscribers who have already received it.

Select all of these subscribers, and add a temporary Tag (e.g., a Tag called "Cancelled A/B test") to them.

Duplicate the Broadcast. Then, send it to your initial recipients list (i.e., all 100% of them) except for the subscribers who have the temporary Tag you just created.

Feel free to delete the temporary Tag after that.
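The recovery steps above amount to simple set subtraction: everyone on the original list except the tagged 30%. A minimal sketch (the email addresses and variable names are made up for illustration):

```python
# Hypothetical subscriber lists, for illustration only.
all_recipients = {"ann@example.com", "ben@example.com", "cal@example.com", "dee@example.com"}
tagged_cancelled_ab = {"ann@example.com", "ben@example.com"}  # the 30% who already got the email

# The duplicated Broadcast should go to everyone NOT carrying the temporary Tag.
remaining = all_recipients - tagged_cancelled_ab
```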

Why does the winner of my A/B test have a lower open rate?

At an A/B test's four-hour mark, our system will automatically send out the subject line variant with the higher open rate at that time to your remaining recipients.

That winner will receive a "Winner" badge in the Broadcast's reports:

After the initial four-hour testing period, the original test recipients can (and likely will!) continue to open the original two test variants. The isolated stats for each test variant will continue to be updated even after the test is over and a winner has been determined.

In some cases, what was the losing variant at the four-hour mark can end up overtaking the winner later on.

Once the testing period is over, the original winner (i.e., the one that had the higher open rate at the four-hour mark) will have already been sent out to the remaining recipients. This winner will continue to display the "Winner" badge even if the other variant overtakes it later on.

Why does my A/B test winner have a lower click rate?

Click rates do not factor into A/B testing—only open rates.

We display the click rate per variant for your reference. However, it will not factor into which subject line wins the test.
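In other words, the winner is chosen by comparing open rates alone. A minimal sketch of that rule (the field names are assumed, not ConvertKit's actual data model):

```python
def pick_winner(variants: list) -> dict:
    """Return the variant with the higher open rate; click rate is ignored."""
    return max(variants, key=lambda v: v["open_rate"])

variants = [
    {"subject": "A", "open_rate": 0.42, "click_rate": 0.03},
    {"subject": "B", "open_rate": 0.38, "click_rate": 0.09},
]
# Subject A wins on open rate even though B's click rate is higher.
```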
