In this article, you'll learn how to create variants of a campaign that you can test against a control campaign. For some examples of the changes you might consider for the type of variant you're creating, see Campaign Variant Types.
TIPS
To generate the most conclusive results and to limit the effort required to assess experiment results, limit the number of variants you create—one or two should be enough to test your hypothesis. We also recommend that you test one hypothesis at a time.
If you intend to make changes to the control (original) campaign you're experimenting with, do so before you create experiment variants. Any changes you make to a control campaign after creating variants for an experiment are not reflected in the variants.
This article focuses on email campaigns, but variants are created the same way for other campaign types, with different fields to experiment with.
Creating variants for from name, subject line, or preheader
The steps for creating a variant that tests opens and open rates are the same whether you're testing from name and sender, subject line, or preheader; you just specify values for different fields.
On the Create variations page, edit the field(s) that match the Experiment with value you chose during setup.
For example, maybe you want to create a variant of a subject line that includes an emoji.
If you want to add another variant, click Add new variation.
When you're done creating variants, click Save & Review.
Be sure to review all variants—you want to avoid changing an experiment after you launch it, as doing so may cause inconsistent results.
Click Save & Continue.
Launch the campaign variants for the experiment.
If you're creating an experiment from a new campaign, when you return to the Review & Launch page:
For a blast campaign, schedule the campaign so the experiment automatically sends with it.
For a triggered or journey campaign, activate the campaign and then manually launch the associated experiment.
If you're creating an experiment for an active campaign (blast or triggered), the experiment will send the next time the campaign sends (for triggered campaigns, see the sketch after the notes below).
NOTES
For a journey campaign, you can alternatively let a campaign visitor automatically activate the campaign after you enable the journey.
For recurring campaigns, experiments are associated only with the first instance of a campaign, and are not relaunched with each recurring send.
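For context, a triggered email campaign's "next send" typically happens when your system calls Iterable's API (or a journey reaches the campaign), and any running experiment on that campaign applies to those sends. The following is a minimal sketch, assuming Python with the requests library and Iterable's triggered-email send endpoint; the API key, campaign ID, and email address are placeholders, not values from this article.

```python
import requests

API_KEY = "YOUR_ITERABLE_API_KEY"   # placeholder
BASE_URL = "https://api.iterable.com"

# Send an active triggered email campaign to one recipient.
# If the campaign has a running experiment, this send is where the
# recipient receives the control or one of its variants.
response = requests.post(
    f"{BASE_URL}/api/email/target",
    headers={"Api-Key": API_KEY},
    json={
        "campaignId": 1234567,                 # hypothetical triggered campaign ID
        "recipientEmail": "user@example.com",  # placeholder recipient
    },
    timeout=10,
)
response.raise_for_status()
print(response.json())
```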
Creating send time variants
You've already specified a send time for the control campaign. Now, specify send times for each variant you create.
On the Create variations page, edit Variation name if you want it to reflect the type of change you're making.
Specify Send date and Send time for this variant.
IMPORTANT
If your campaign's send lists have highly dynamic membership, launch the send time experiment as close to your campaign's first send as possible. This way, the list's membership at the start of the experiment will be similar to what it was at the start of the campaign.
If you want to add another variant, click Add new variation.
Schedule each additional variant at least one hour after the variant you created before it.
When you're done creating variants, click Save & Review.
Click Save & Continue.
Launch the campaign variants for the experiment.
If your campaign isn't yet scheduled or activated, do so now. Experiments for scheduled (blast) campaigns will send with the campaign, but experiments for recently activated triggered campaigns must be manually activated.
Experiments for recurring campaigns are associated only with the first instance of a campaign, and are not relaunched with each recurring send.
Creating variants of a message body
Experiment with the message body to measure the effectiveness of a campaign (and its variants) on clicks, purchases, or custom events.
For example, maybe you want to experiment with a winback campaign that currently offers churned users a 25% discount on their next purchase. Perhaps you want to see whether they convert more readily when you highlight what they've missed while they've been gone.
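If your experiment measures purchases or custom events, those conversions reach Iterable through your own tracking calls. As a minimal sketch (assuming Python with the requests library and Iterable's purchase tracking endpoint), reporting a purchase might look like the following; the item details, email address, and API key are placeholders, and Iterable attributes the conversion through its attribution period.

```python
import requests

API_KEY = "YOUR_ITERABLE_API_KEY"   # placeholder
BASE_URL = "https://api.iterable.com"

# Report a purchase so Iterable can attribute it (via its attribution
# period) to the campaign variant the user received.
response = requests.post(
    f"{BASE_URL}/api/commerce/trackPurchase",
    headers={"Api-Key": API_KEY},
    json={
        "user": {"email": "user@example.com"},  # placeholder recipient
        "items": [
            {
                "id": "sku-123",          # hypothetical item
                "name": "Example item",
                "price": 75.0,
                "quantity": 1,
            }
        ],
        "total": 75.0,
    },
    timeout=10,
)
response.raise_for_status()
```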
At the bottom of the Create variation page, click Save Experiment to enable editing.
Then click Edit Variation to open the template editor, and click Edit Design.
Make the changes you want to test, and click Save Design & Exit.
From the Create variation page, click Save & Review.
Click Save & Continue.
Launch the campaign variants for the experiment.
If your campaign isn't yet scheduled or activated, do so now. Experiments for scheduled (blast) campaigns will send with the campaign, but experiments for recently activated triggered campaigns must be manually activated.
Experiments for recurring campaigns are associated only with the first instance of a campaign, and are not relaunched with each recurring send.
TIP
To create a variant of a campaign that tests large-scale changes, try breaking them into discrete experiments. For example, create one experiment to test messaging, and another to test a new layout. This way, you can evaluate which change has the greatest impact.
Creating a Send Time Optimization test
To test whether a blast campaign has a better open rate with Send Time Optimization enabled (compared to the campaign's scheduled send time), create an STO experiment.
NOTE
You can't disable Send Time Optimization after a campaign starts sending.
To associate an STO experiment with a new campaign, attach a template and update its content. Then, in Conversions and Experiments, click Edit and then Add Experiment.
On the Experiment Setup page, choose Send Time Optimization and click Create.
If Send Time Optimization isn't visible, your Iterable project doesn't yet have enough historical user engagement data to calculate optimized send times.
From the Create variation page, click Review, and in the preview window, click Save & Continue.
On the Review & Launch page, make sure that Enable Send Time Optimization is checked.
Specify the maximum number of hours (between 6 and 24) after the campaign's configured send time (whether immediate or scheduled) that Send Time Optimization can send messages. STO optimizes send times at a one-hour granularity and sends messages at the top of the hour.
WARNING
If you add a rate limit to your campaign, some messages may send outside of your STO window. Rate limits are a beta feature; talk to your Iterable customer success manager to add them to your account.
Schedule the campaign to send now or at a specific time.
NOTE
Specifying a recurrence pattern for a Send Time Optimization campaign causes each recurrence to use Send Time Optimization. However, STO experiments do not recur.
Holdout group tests
It's not possible to create a standalone holdout group experiment. Instead, create one of the types of experiments mentioned in this article, and when you're setting it up, add a holdout group. See Setting up an Experiment.
You don't have to add a variant to a holdout experiment if you don't need one. If you're only interested in measuring campaign conversions and holdout group conversions, simply skip the variant setup step. When your experiment is launched, it will be sent to the users who are not part of the holdout group.
NOTE
For any given experiment, you can create a single holdout group.
WARNING
If you're testing out Iterable's holdout groups functionality to get a sense of how it works, and you make an API call to track a custom conversion event for one of the holdout group's members, do not include a campaignId in the API request body. Iterable uses attribution periods to track holdout group conversions. For a real campaign, a holdout group member wouldn't have received the message, and therefore wouldn't have a campaign ID to associate with their conversion event. If you do include a campaignId for such a user, their conversion won't be associated with the holdout group.
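To illustrate, here's a minimal sketch (assuming Python with the requests library and Iterable's custom event tracking endpoint) of a conversion event tracked for a holdout group member, with campaignId deliberately left out; the event name, email address, and API key are placeholders.

```python
import requests

API_KEY = "YOUR_ITERABLE_API_KEY"   # placeholder
BASE_URL = "https://api.iterable.com"

# Track a custom conversion event for a holdout group member.
# Note that the body contains no "campaignId": Iterable attributes holdout
# conversions through its attribution period, and including a campaignId
# here would keep the conversion from counting toward the holdout group.
response = requests.post(
    f"{BASE_URL}/api/events/track",
    headers={"Api-Key": API_KEY},
    json={
        "email": "holdout-user@example.com",  # placeholder
        "eventName": "completedOnboarding",   # hypothetical custom conversion event
    },
    timeout=10,
)
response.raise_for_status()
```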
Next steps
After an experiment launches, you can monitor its results on the Experiments index page. For information about the statuses reported there, read Managing Experiments. To learn when and how Iterable selects a winning campaign variant, and what happens after a winner is selected, see Experiment Winner Selection.