Once your A/B experiment is up and running, it's often useful to jump in and take a look at the performance of the variations, whether or not Iterable is choosing the winner for you.
#Experiment metrics table
When you click into an experiment, you'll be taken to a page that looks something like this:
In this case, the experiment was tracking the opens metric, so the Improvement and Confidence columns in the table reflect the open rate. Metrics are shown side by side for each variation created in the experiment.
If you'd like to display more metrics, use the drop-down menu at the upper left of the table. From there you can choose metrics like Total Sends, Delivery Rate, Click Rate, Revenue or others, depending on the medium you're testing (email vs. push, for example).
If you'd like to view all metrics in a CSV file, use the Export Table to CSV button at the bottom right of the table.
#Exporting all or multiple experiments
In some cases, you may want to pull historical experiment data across all experiments, or export multiple experiments at once.
To do that, use the `/experiments/metrics` API endpoint.
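As a rough illustration, the request can be built like the sketch below. The experiment IDs, date value, and API key shown are placeholders, and the exact parameter names should be checked against Iterable's API reference before use.

```python
from urllib.parse import urlencode
from urllib.request import Request

# Hypothetical values -- substitute your own Iterable API key and experiment IDs.
API_KEY = "YOUR_API_KEY"
BASE_URL = "https://api.iterable.com/api/experiments/metrics"

def build_metrics_request(experiment_ids, start=None, end=None):
    """Build a GET request for experiment metrics (the endpoint returns CSV)."""
    # The endpoint accepts repeated experimentId parameters plus an
    # optional date range (assumed parameter names shown here).
    params = [("experimentId", eid) for eid in experiment_ids]
    if start:
        params.append(("startDateTime", start))
    if end:
        params.append(("endDateTime", end))
    url = f"{BASE_URL}?{urlencode(params)}"
    return Request(url, headers={"Api-Key": API_KEY})

req = build_metrics_request([12345, 67890], start="2024-01-01")
# urllib.request.urlopen(req) would fetch the CSV; here we just inspect the URL.
print(req.full_url)
```

Each `experimentId` is passed as its own query parameter, so a single request can cover several experiments at once.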