Configuring and sending a campaign that's using split testing

Contents

Configuring your split test options
   » Selecting the ROI metric
Saving and sending your split test
Split test reporting
   » Filtering by split test variation

If your campaign is using split testing to test multiple subject lines, friendly from names, creative, and from addresses in order to see which work best for you, then you'll need to configure your split test options before sending. These options comprise three key components:

  • the sample percentage of your selected contacts that you'd like to send the split test to
  • the number of hours over which you'd like to measure responses
  • the metric you'd like to use to record responses and judge a winner (opens, clicks, or ROI data*)
*To use the ROI data metric, you will need to have ROI tracking and ROI markers set up in your account before configuring and sending a split test campaign.

Configuring your split test options

Step 4 Contacts of the campaign creation steps (the 'Select contacts and schedule campaign' screen) enables you to set your split test options:

split_test_set_up.png

You can access the 'Select contacts and schedule campaign' screen in the following ways:

  1. If you have closed your campaign, select Campaigns from the navigation bar and then click on the Send icon against the campaign you want to send.
  2. If you are within the campaign creation step process (this is the case if the steps are displayed in the top right corner of your screen), click on step 4 Contacts.
  3. Alternatively, you will arrive at this screen naturally if you continue through the campaign creation steps after creating your campaign content.

From the 'Select contacts and schedule campaign' screen, select one or more address books and/or segments to send the campaign to:

select_the_contacts_to_send_this_campaign_to.png

You now need to specify when you want the campaign sent:

select_when_you_would_like_to_send_the_campaign.png

Check the Immediately box if you want the campaign to go out now, or check the Scheduled box and set a date and time for it to be sent in the future.

As you've selected the option to split test different variables in your campaign, you now need to complete the split testing configuration area of the screen:

set_your_split_testing_options.png

Contact percentage to receive split test: Enter the percentage of your selected contacts you wish to use for the split test. The default setting is 5%. The contacts selected here will be split evenly across the variations of the campaign you are testing (e.g. if you have selected 10% and have split test campaigns A and B, then 5% will be sent A and 5% will be sent B).
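As an illustration of this arithmetic (a hypothetical sketch, not platform code; the function name and figures are invented):

```python
def split_test_allocation(total_contacts, sample_percent, num_variations):
    """Split the test sample evenly across the variations;
    the remainder later receives the winning version."""
    sample = int(total_contacts * sample_percent / 100)
    per_variation = sample // num_variations
    remainder = total_contacts - per_variation * num_variations
    return per_variation, remainder

# e.g. 10,000 contacts, a 10% sample, and variations A and B:
per_variation, remainder = split_test_allocation(10_000, 10, 2)
print(per_variation, remainder)  # 500 contacts each get A and B; 9,000 get the winner
```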

Time delay: Enter the number of hours you want responses to be measured for. The default setting is five hours.

In our example, we have used five hours, meaning that after five hours the rest of our contacts will be sent the strongest performing campaign, this being the winner of the split test.
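Conceptually, the winner selection works like this (a minimal sketch; the figures are invented, and how the platform breaks a tie isn't documented here, so this assumes a simple maximum):

```python
def pick_winner(responses):
    """Return the variation with the most recorded responses
    (e.g. unique opens or unique clicks) once the time delay has elapsed."""
    return max(responses, key=responses.get)

# e.g. unique opens recorded during the five-hour delay:
print(pick_winner({"A": 312, "B": 347}))  # prints "B"
```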

External dynamic content may cause sending delays

If your split test campaign makes use of external dynamic content, this may delay completion of the split test send as you've configured it here. Each external dynamic content block performs an HTTP request, which can be slow to make. If your campaign contains many external dynamic content blocks, this can significantly reduce its send speed, and the delay may exceed your configured time delay.

Metric to be used: Select whether you wish to measure opens, clicks or ROI markers when recording the responses. As mentioned above, the ROI metric will only be available if you've already set up ROI tracking and ROI markers in your account.

split_test_metric.png

Selecting the ROI metric

There are some additional configuration steps when selecting the ROI metric.

You'll be prompted to select one of your markers from the dropdown menu. 

select_ROI_marker.png

Which one you choose and its data type (text, number, date or 'yes/no') will dictate your next set of marker metric options. For instance, a marker with a number data type will offer a choice of hits, total and average. This is because a number marker can not only be hit by a visitor to your website page, but its numeric value also means a total and an average can be calculated for each split test version. In the case of CheckOutAmount, a winner can be decided upon the total spend across all customers or the average spend per customer.

A marker with any data type other than number will offer only the default hits metric, as nothing further can be calculated for data types of this nature.

select_marker_metric.png

When clicking on a different metric option, a helpful box appears providing further information on what the metric means and what it's useful for.
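To illustrate how the three metric options for a number-type marker relate to one another (the marker values below are invented for illustration):

```python
from statistics import mean

# Hypothetical CheckOutAmount marker values recorded per split test version
marker_hits = {
    "A": [25.00, 40.50, 19.99],
    "B": [120.00, 15.00],
}

for version, values in marker_hits.items():
    hits = len(values)       # any data type: count of visitors hitting the marker
    total = sum(values)      # number data type only
    average = mean(values)   # number data type only
    print(f"{version}: hits={hits}, total={total:.2f}, average={average:.2f}")
```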

Saving and sending your split test

When you're satisfied with your configuration, click the Save & continue button. The campaign settings are displayed for you to review. Each set of variables is shown as a different summary tab:

campaign_summary_screen.png

When ready, click on either the Send campaign immediately or Confirm scheduled campaign button at the bottom of the screen, depending on what you chose previously. A final confirmation warning is displayed:

final_prompt.png

Clicking on the Confirm button will execute the campaign.

The campaign is moved to the Outbox while processing occurs and this is shown by the indicator displayed under the Status column. Once completed, the campaign will appear under the Sent tab.

Split test reporting

Once your split test has been sent, you can click on Reporting.

If your split test is still in the outbox, the reporting header for your campaign will tell you how long remains before your split testing period finishes. If it has completed, the winning split test campaign version and its associated metric (opens, clicks or ROI) will be displayed, expressed as a percentage.

split_test_winner_reporting.png

If you have used an ROI marker as your metric, your winning split test version will be similarly indicated in the reporting header, again expressed as a percentage.

Filtering by split test variation

To filter on split test variations, click on Filter, select Split test variation, choose A, B, C, etc. from the dropdown, and then click Apply filter. The report will indicate that it's refreshing before updating with the relevant statistics.

split_test_filter_on_reporting2.png


Comments

  • If you choose clicks as the metric, is this total clicks or unique clicks?

  • The clicks metric is measured on unique clicks.

  • Are the percentages that you get for A and B just from the split send, or for the complete send?

  • Hi Jo,

    Yes, these percentages for A and B in the reporting header relate solely to the split test part of the send. They indicate the result of the split test, based upon the metric you set, collected over the period of time that you set. After the split test period is over, the winning version will then be sent to the remainder of contacts in the address book/s.

  • Hi Neal,

    We use our own custom unsubscribe and update preferences links. If I click 'do not track', presumably these will not be counted in any A/B split tests? I don't want the winning version to be the one where more people are clicking unsubscribe.

    Helen

  • Hi Helen,

    Yes, I can confirm that's correct. Marking links as 'Do not track this link' means any clicks on them aren't counted at all towards an A/B split test.

  • Hi again,

    We were wondering if untracked links are counted in the total unique clicks, or would they be removed from all figures?

    Helen & Seb

  • Hi Helen,

    A click on an untracked link isn't counted as a unique user click-through or as a link click; reporting ignores them entirely.

  • If both campaigns perform equally well, how does the system determine which one to send out?

  • Hello, is there a recommended percentage to receive the split test, and also a recommended wait time? I'm not provided with ROI as an option for measuring, only click or open. Why would this be?

  • Hi Marianne,

    Thanks for your questions.

    You won't see the ROI metric if you haven't already set up ROI tracking and ROI markers in your account. This article explains how - https://support.dotmailer.com/hc/en-gb/articles/212212388-Using-site-and-ROI-tracking.

    These are really good questions about the recommended percentage and waiting time! Whilst I can advise in very general terms in the interim, I am seeking some specific best practice advice for you from our experts in this area, so I'll post again here when I have that to hand.

    As you'll see, the default percentage is set at 5%, and you might want to try that to begin with (although this is really the bit of best practice advice I'm keen to get for you). As for the length of testing time (and again this is fairly general), I'd say test for as long as you have the luxury to, as this gives your test group the best chance to see, open and click on your email. In turn, this will give you a useful result set on which to base the sending of the winning version. If you're pushed for time, then two hours is better than one hour, for the same reason.

    Hope that helps, but as I've said, I hope to come back to you with more when I can.

  • Thanks Neal - this is a good starting point for us and I'll stay tuned for the rest.
