Once your A/B Experiment is up and running, it's often useful to jump in and take a look at how each variation is performing, regardless of whether Iterable is choosing the winner for you.
Experiment Metrics Table
When you click into an Experiment, you'll be taken to a page that looks something like this:
In this case, the Experiment was tracking the Opens metric, so the Improvement and Confidence columns in the table reflect the Open Rate. Metrics are shown side by side for each variation created in the Experiment.
If you'd like to display more metrics, use the drop-down menu at the upper left of the table. From there, you can choose metrics like "Total Sends", "Delivery Rate", "Click Rate", "Revenue", or others, depending on the medium you're testing (Email vs. Push, for example).
If you'd like to view all metrics in a CSV file, use the Export Table to CSV button at the bottom right of the table.
Exporting All/Multiple Experiments
In some cases, you may want to pull historic experiment data across all experiments, or export multiple experiments at once.
To do that, use the /api/experiments/metrics API endpoint, which is documented in our support docs.
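As a rough sketch of how you might call that endpoint from a script, the example below builds a request URL for one or more experiment IDs and fetches the result. The parameter names (experimentId, startDateTime, endDateTime), the Api-Key header, and the CSV response format are assumptions here; confirm the exact request contract in the API documentation before relying on it.

```python
# Hypothetical sketch of pulling experiment metrics from the
# /api/experiments/metrics endpoint. Parameter names, the Api-Key
# header, and the CSV response format are assumptions -- verify
# them against the official API docs.
import urllib.parse
import urllib.request

BASE_URL = "https://api.iterable.com/api/experiments/metrics"

def build_metrics_url(experiment_ids, start=None, end=None):
    """Build the request URL for one or more experiment IDs,
    optionally bounded by a start/end date-time range."""
    params = [("experimentId", eid) for eid in experiment_ids]
    if start:
        params.append(("startDateTime", start))
    if end:
        params.append(("endDateTime", end))
    return BASE_URL + "?" + urllib.parse.urlencode(params)

def fetch_metrics(url, api_key):
    """Perform the GET request and return the response body as text
    (assumed here to be CSV, matching the table export)."""
    req = urllib.request.Request(url, headers={"Api-Key": api_key})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")
```

For example, `build_metrics_url(["12345", "67890"])` produces a single URL that requests metrics for both experiments at once, which is the multi-experiment export described above.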