You can set up an A/B test to measure and compare the performance of different creative groups or targets. For example, you can allocate a portion of the existing budget to see if the new concept or theme of Creative Group B is more efficient than that of Creative Group A.
A/B tests aren't recommended for comparing the performance of different creative formats, such as image and video. Because ad placement varies by creative format, it is difficult to rule out after the fact that ad placement, rather than creative format, caused the differences in performance.
Set up an A/B test
Important: For iOS campaigns, make sure that probabilistic attribution is enabled. A/B tests are unavailable for campaigns that run only on unattributable or SKAdNetwork-attributable traffic.
You can set up A/B tests for individual campaigns using ad groups for those campaigns. Follow the steps below to set up an A/B test.
- Select an ad account from the top-left drop-down menu.
- Select an app from the left sidebar.
- Click a campaign title from the list of campaigns under Campaigns.
- Scroll down to find A/B Test groups. You can see the list of A/B tests created so far.
- Click Create. This takes you to New A/B Test, where you can configure settings for the new A/B test.
- Enter a title for the A/B test. The description is optional.
- For Variable, select Creative Group or Target from the drop-down menu.
- Under Test Groups, select an ad group as the control group. Depending on the variable selected, you will see the list of creative groups or targets in the ad group to be used for the A/B test.
Important: After assigning a control group, you can't change the creative groups or targets assigned to the control group.
- Select one or more ad groups as test groups. You can use the + Add Test Group button to add up to 10 test groups. For each test group, you can see the status and assigned creatives when Creative Group is selected for Variable, or the target settings when Target is selected for Variable. When Creative Group is the variable, targets are randomly divided among the control and test groups to avoid overlap; when Target is the variable, creative groups are randomly divided instead. Be sure to create the ad groups you want to test before setting up the A/B test. To learn how to create an ad group, see ad group settings.
- Set a budget, key metric, and schedule for the A/B test.
| Feature | Description |
| --- | --- |
| Test budget | Enter the percentage of the campaign budget to allocate to the A/B test. You can allocate up to 100% of the campaign budget. The test budget is split evenly among the control and test groups. If a different daily budget is set for each day of the week, the daily A/B test budget is adjusted according to the daily budget allocated for each day of the week. |
| Key metric | Select the metric to track for the A/B test. You can choose from CPA, CPI, and CPC. Note that CPA is only available for in-app event optimization campaigns, and the events used to measure ad group performance are the same as the campaign's optimization target events. |
| Schedule | Specify a start date and an end date for the A/B test. The timezone is the same as the ad account timezone. You can change the end date at any time. A/B tests must run for at least 24 hours and no longer than 30 days. |

- Click Create to launch the A/B test. Keep in mind that after the A/B test has been activated, you can't update the assigned creative groups, targets, or test budget. A/B tests can't be reactivated after they have been completed; you must set up a new A/B test if you would like to re-run the test with the same creatives or targets. You can always extend the end date of the A/B test up to 30 days from the start date to collect more data.
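The budget arithmetic described above can be sketched as follows. This is an illustration of the documented behavior (an even split across control and test groups, adjusted to each weekday's budget), not Moloco's implementation; the function name and values are examples.

```python
def daily_test_budgets(daily_budgets, test_budget_pct, num_groups):
    """Per-day, per-group A/B test budget.

    daily_budgets: mapping of weekday -> campaign daily budget
    test_budget_pct: percentage of the campaign budget allocated to the test
    num_groups: control group + test groups (2 to 11 per the limits above)
    """
    return {day: budget * test_budget_pct / 100 / num_groups
            for day, budget in daily_budgets.items()}

# A campaign that spends $500 on Mondays and $800 on Saturdays,
# with 40% of the budget allocated to a test with one control
# group and three test groups:
per_group = daily_test_budgets({"Mon": 500, "Sat": 800}, 40, num_groups=4)
print(per_group)  # → {'Mon': 50.0, 'Sat': 80.0}
```

Note how the per-group amount tracks the weekday budget: each group gets an equal share of the test budget for that day.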
A/B test status and results
Important: For iOS campaigns, you can only check A/B test results based on MMP attribution data. Results based on SKAN attribution data aren't available at this time.
Under A/B Test groups, you can see the status of A/B tests created so far and check out the results of A/B tests that have been completed. Follow the steps below to see a list of A/B tests created so far and A/B test results.
- Select an ad account from the drop-down menu on the top left of your screen.
- Select an app from the left sidebar.
- Click a campaign from the list of campaigns under Campaigns.
- Scroll down to find A/B Test groups. You can see the list of A/B tests you have created so far.
- Click the icon next to the A/B test of your choice to view the results for creative groups or targets used in the test.
- To see the outcome of an A/B test, click the title of the A/B test of your choice. This takes you to the page where you can find all details about the A/B test as well as the outcome. Under Test Result, you can find the best performing test group that produced significant results and test groups that outperformed the control group, if any.
- Best performing test group with statistically significant results
- Test group that outperformed the control group
Under the Overall performance tab, Moloco also provides suggestions for running the A/B test based on the information available at the time. As results are likely to vary across the different key metrics, we recommend running a comprehensive analysis of results based on all the available metrics.

| Suggestion | Description |
| --- | --- |
| It's still too early to determine a winning group as no statistically significant results have been achieved. | We recommend running the A/B test until the scheduled deadline or updating the test schedule to a longer period of time to collect sufficient data for analysis. |
| {{control or test group}} has achieved {{n}}% less {{key metric}} than the {{test or control group}}. | The best performing group has been identified and there is no need to run another test. The best performing group is identified only when the confidence level exceeds 75%. |
| There's no clear performance difference between your test groups. | We have identified one or more test groups that outperformed the control group, but the difference in performance isn't statistically significant, as the confidence level is below 75%. |

Clicking Apply to ad groups lets you copy over the configurations from the best performing group to one or more existing ad groups or a new ad group.
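To build intuition for the 75% confidence threshold mentioned above: Moloco does not document the exact statistical method it uses, but a simple two-proportion comparison under a normal approximation illustrates the idea. The function name, field choices, and numbers below are illustrative assumptions, not Moloco's actual computation.

```python
import math

def confidence_test_beats_control(control_installs, control_impressions,
                                  test_installs, test_impressions):
    """Approximate probability that the test group's true install rate
    exceeds the control's, via a normal approximation (illustrative only)."""
    p_c = control_installs / control_impressions
    p_t = test_installs / test_impressions
    # Standard error of the difference between the two proportions
    se = math.sqrt(p_c * (1 - p_c) / control_impressions
                   + p_t * (1 - p_t) / test_impressions)
    z = (p_t - p_c) / se
    # Phi(z): cumulative normal probability that the test group is better
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

conf = confidence_test_beats_control(200, 100_000, 230, 100_000)
print(f"confidence test beats control: {conf:.1%}")
print("winner identified" if conf > 0.75 else "no clear winner yet")
```

With these example counts the confidence comes out around 93%, above the 75% bar, so a winner would be declared; with smaller samples the same rate difference would stay below the threshold, which is why the suggestions above recommend extending the schedule to collect more data.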
Under the Daily breakdown tab, you can compare the daily performance of the control and test group(s) based on one of the available metrics, which you can specify using the drop-down menu on the left. You can choose from the following metrics; to learn more about each of them, see the definitions.
- Spend
- Impression
- Click
- Install
- Engaged View
- Revenue
- Action
- CTR
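The rate and cost metrics in this list derive from the raw counts in a straightforward way. The sketch below shows the standard relationships; the field names and figures are illustrative, not Moloco's reporting schema.

```python
def derive_metrics(day):
    """Derive rate/cost metrics from one day's raw counts (illustrative)."""
    return {
        "CTR": day["clicks"] / day["impressions"],  # click-through rate
        "CPC": day["spend"] / day["clicks"],        # cost per click
        "CPI": day["spend"] / day["installs"],      # cost per install
    }

day = {"spend": 120.0, "impressions": 40_000, "clicks": 800, "installs": 60}
print(derive_metrics(day))  # → {'CTR': 0.02, 'CPC': 0.15, 'CPI': 2.0}
```

Comparing a derived metric like CPI day by day, rather than raw spend or installs alone, is what makes the daily breakdown useful for spotting whether one group is consistently more efficient.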