The A/B Testing Platform serves as your Test-and-Learn lab. It enables you to execute a structured testing strategy so that your marketing activity is always data-driven. You can run any number of tests simultaneously and track performance in real time.
Our A/B Testing solution allows you to:
Create controlled A/B Tests across campaigns or ad sets
Evaluate the test results based on statistical significance
Automatically take action based on performance
How to set up an A/B test?
You start by setting up the two campaigns you will use in the test. There are three things to check while setting up your campaigns:
The two campaigns are completely identical except for the one difference you want to test
The ad sets are the same
The campaigns are paused
To learn how to create a successful A/B test, see the A/B Test Guide document.
When the campaigns are ready, open the A/B test creation page (shown below) and start building the test by clicking the "Create an A/B Test" button or the green "Create Test" button in the upper right corner of the screen.
When you click the button, a pop-up will appear where you can enter the test details.
Required fields on the Create A/B Test form
Test Name: Name your A/B test so that you can easily search for it later
Test Description: Describe your A/B test goal so that you can easily identify it later
Tag: Define a tag to categorize your A/B tests (e.g., creative test). You can define only one tag per A/B test
Facebook Account: Select the Facebook Account that is connected to your Business
Facebook Business: Select the Facebook Business to which you would like to assign your A/B test
Start Date: Enter the date you would like to start the A/B test
End Date: Enter the date you would like to end the A/B test (Please note: Maximum duration of an A/B test is 90 days)
Test Across: At the bottom of the screenshot above, the creation modal asks you to decide whether you will test across campaigns or ad sets. In this screenshot, campaigns is selected, since that is the method Adphorus recommends. However, you can also test across ad sets, which means you will have two identical ad sets under the one campaign you are testing.
Study Groups: Scroll down and enter the two campaigns that will be used in the test. When you start typing an Adphorus Campaign ID into the assigned campaigns field, the interface automatically suggests matching campaigns. Click a suggestion to add it to the test.
If you would like, you can add multiple campaigns under one study group. This is useful when, for example, you want to test the same aspect across multiple markets. Let’s say you are testing the same two creatives, train vs. charts (in different languages, of course), in 10 different markets. In this case, you can put all the train-image campaigns in one study group and all the chart-image campaigns in the other. This way, you can prepare the split for 10 tests with one click.
Size: The percentage of the audience you want on that side of the test. Most of the time this will be 50-50, but there may be cases where you want different sizes on each side. In those cases, you can adjust the percentage using the size field.
Assigned Campaigns / Ad Sets: Choose the campaigns / ad sets that you would like to assign to the study group
Mark As Control: When you run an A/B test, we ask you to mark one side of the test as the control group, which makes the other side the test group. Preferably, the control group should be the side with the setup you have been using until now, and the test group should be the side with the new setup. Please note that marking one side as control does not affect the test or campaign performance; this option is for displaying test results only.
Comparison Metric: This is the key metric that you will be looking at to determine the winner of the test.
Attribution Window: Define the attribution window over which the comparison metric will be measured.
Email Notification: Check this option if you want to be notified when the results are statistically significant with a confidence level of 95%.
Then click "Create" in the bottom right corner to create your test. Once you have completed all the steps here, don’t forget to activate your campaigns to start the test!
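The form enforces a few date rules described above: the test must end after it starts, and it can run for at most 90 days. As a minimal sketch (this is a hypothetical client-side check, not part of the Adphorus API), those rules look like this:

```python
from datetime import date

MAX_DURATION_DAYS = 90  # maximum A/B test duration per the platform

def validate_dates(start: date, end: date) -> None:
    """Illustrative check of the Start Date / End Date rules above."""
    if end <= start:
        raise ValueError("End Date must be after Start Date")
    if (end - start).days > MAX_DURATION_DAYS:
        raise ValueError("Maximum duration of an A/B test is 90 days")

# A 60-day test passes validation; a 152-day test would raise an error.
validate_dates(date(2024, 1, 1), date(2024, 3, 1))
```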
While Your Test is Running
Once your test starts running, you will start seeing the most recent results on the A/B Test panel as shown below.
For each A/B test, your control group will be listed first. Next to each test group, you will see both the spend and comparison metric that you have chosen for the test.
On the right, the "Chance to Beat Control" progress bar shows the progress of the test in real time. "Chance to Beat Control" is displayed for each test group based on the selected comparison metric. It is calculated using the Z-score to determine whether the results of the A/B test are statistically significant. See what each progress bar color signifies below.
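To build intuition for how a Z-score yields a "Chance to Beat Control" figure, here is a minimal sketch using a standard two-proportion z-test. This is an illustrative approximation, not the platform's exact formula, and the conversion counts are made up:

```python
import math

def chance_to_beat_control(control_conv, control_n, test_conv, test_n):
    """Probability that the test group's conversion rate beats the
    control's, via a two-proportion z-test (illustrative sketch)."""
    p_c = control_conv / control_n
    p_t = test_conv / test_n
    # Standard error of the difference between the two proportions
    se = math.sqrt(p_c * (1 - p_c) / control_n + p_t * (1 - p_t) / test_n)
    z = (p_t - p_c) / se
    # Standard normal CDF expressed via the error function
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# Hypothetical data: control 120/10,000 conversions, test 150/10,000
prob = chance_to_beat_control(120, 10_000, 150, 10_000)
significant = prob >= 0.95  # the platform's 95% confidence threshold
```

Here the test group's chance to beat control is roughly 97%, which clears the 95% confidence level mentioned for email notifications.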
Adphorus account admins are also able to change an A/B test’s manager while the test is still active.
Completing Your A/B Test
To finish the test, click "Complete" on the interface.
This ends your split and opens a pop-up window asking whether you want to pause the campaign(s). Choose the losing side and click "Pause Selected". This pauses the campaign that lost the test, while the winning side stays active so you can continue with that campaign.