
    When creating a campaign, many advertisers ask themselves a number of questions: what to optimize for, where to run the ads, and which audience to target. Many elements can determine a campaign's success or failure. Comparative testing, known as A/B testing, can come to our aid here.


    What are A/B tests?

    An A/B test, in a nutshell, compares two versions of an ad strategy: you select one variable and test which variant performs better or produces more satisfactory results on your audience. Facebook displays each version of the ad to a separate audience segment, so no user sees both. Comparative testing makes it easier to assess which strategy works best for your business.

    Testing is crucial for Facebook optimization. You can test different elements of your ad campaigns, such as creative, placement, and ad text. Experimenting with these elements can help you create a campaign tailored to your audience.

    How do we create A/B tests?

    Before you start creating an A/B test, it’s a good idea to define its purpose at the outset, i.e., the particular metric you’d like to focus on first and foremost. If you want, you can even define a hypothesis for your test, so that you can later check whether the results match your initial assumptions. This also gives you a reference point for specific actions when creating more campaigns in the future.

    The A/B tests of Facebook ad campaigns themselves can be created in several ways:

    • Directly in the Experiments tab — in the Experiments tool you can create and duplicate ad campaigns to compare strategies and choose the best one. This tool is especially useful when you want to refine a campaign before the test begins or use several existing campaigns for the test.
    • When creating a new ad campaign — you can also choose to run an A/B test while setting up a new campaign. This option suits people who want an easy way to test a variable when launching a new campaign.
    • By duplicating an existing element (campaign, ad set, or ad) — you can also select the element you are interested in and create an A/B test in Ads Manager by duplicating it. This approach works well for people who want to quickly make changes within an existing campaign and compare the results.

    When creating an A/B test, we first need to choose one of the available parameters to test:

    • Advertising material — lets us see which image or graphic, text, headline, or CTA button is most effective.
    • Audience group — this variable lets us check which types of audiences are most interested in our ad.
    • Placement — tells us where our ads convert best: on desktop, on mobile devices, in automatic placements, or in manually selected ones.
    • Custom setup — with this option, you create a test by duplicating the selected element (campaign or ad set) and editing any variable in the new test campaign.

    After selecting the parameter, we name our test and specify the condition for determining the winner. Here we have the following options to choose from:

    • Cost of result (depending on the goal of the campaign, it can be cost per click, cost per contact acquisition, etc.)
    • CPC (cost per link click)
    • Cost of reaching 1000 accounts
    • Cost of purchase
    • Events — standard or custom conversions

    It is recommended to choose the parameter that best matches the goal of our campaign. Optionally, we can also select up to nine additional metrics to be included later in the report.
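    To make the winner condition concrete, here is a minimal sketch of the cost-per-result comparison described above. The variant names and numbers are invented for illustration; inside Ads Manager, Facebook computes these metrics for you.

```python
# Illustration only: comparing two ad strategies by cost per result.
# A "result" is whatever event the test was configured to optimize for
# (link click, lead, purchase, etc.). All figures below are made up.

def cost_per_result(spend: float, results: int) -> float:
    """Average cost of a single result for one variant."""
    if results == 0:
        return float("inf")  # no results yet: cost is effectively unbounded
    return spend / results

variants = {
    "A": {"spend": 120.0, "results": 48},
    "B": {"spend": 120.0, "results": 60},
}

costs = {name: cost_per_result(v["spend"], v["results"]) for name, v in variants.items()}
winner = min(costs, key=costs.get)

for name, cost in costs.items():
    print(f"Variant {name}: cost per result = {cost:.2f}")
print(f"Winner (lowest cost per result): {winner}")
```

    Note that with equal spend, the lower cost per result simply falls to the variant with more results, which is why keeping the daily budgets equal (as recommended further below) matters for a fair comparison.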

    The next step is to set the start and end date of the test in the schedule. Facebook recommends testing for at least four days. Shorter tests can distort the results (there is a risk of generating insufficient data), while tests that run too long (more than 14 days) can waste budget unnecessarily. The optimal test length is between 4 and 7 days.

    Additionally, we can enable the option “End the test earlier if a winner is selected”. The test can then end before the scheduled date if a winner is found with a winning probability greater than 80%. Note, however, that the ads will continue to be displayed until the scheduled end date or until they are turned off.
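    Facebook does not publish the exact statistical model behind its “probability of winning”, but the idea can be illustrated with a standard Bayesian comparison of two conversion rates. The sketch below, with invented numbers, estimates the probability that variant B truly outperforms variant A; a threshold like 80% then works as an early-stopping rule.

```python
# Hedged illustration of "probability of winning" (not Facebook's actual model).
# We model each variant's conversion rate with a Beta posterior and estimate
# P(rate_B > rate_A) by sampling. All numbers are invented for the example.

import random

def probability_b_beats_a(conv_a: int, n_a: int, conv_b: int, n_b: int,
                          samples: int = 100_000, seed: int = 42) -> float:
    """Estimate P(rate_B > rate_A) using Beta(1 + successes, 1 + failures) posteriors."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(samples):
        rate_a = rng.betavariate(1 + conv_a, 1 + n_a - conv_a)
        rate_b = rng.betavariate(1 + conv_b, 1 + n_b - conv_b)
        if rate_b > rate_a:
            wins += 1
    return wins / samples

# Variant A: 48 conversions from 2000 impressions; variant B: 70 from 2000.
p = probability_b_beats_a(48, 2000, 70, 2000)
print(f"Estimated probability that B outperforms A: {p:.1%}")
```

    If this probability crosses the configured threshold (e.g. 80%) before the scheduled end date, the test can be called early, which is exactly the behaviour the option above enables.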

    Good practices when creating A/B tests

    • When creating A/B tests, test only one variable per test, leaving the rest of the campaign elements unchanged. This gives you a better chance of getting meaningful results. If at least two changes are made at the same time, you won’t be able to clearly determine which element influenced the result. You can, of course, test more than one variable, but test them one at a time.
    • Make sure your audience group is large enough to avoid under-delivery. Because the audience is split for the purpose of A/B testing, the ad sets that are part of the test may be more prone to under-delivery caused by too small a group. To avoid this, it may in some cases be necessary to expand the audience group.
    • Set an appropriate budget — under-delivery can also be caused by too low a budget. If this happens in your A/B test, try increasing the budget to reach a larger group of people.
    • When testing audiences, placement, or delivery, check whether the ad sets are sufficiently differentiated. If they are too similar, the winner may not be determined reliably.
    • Remember also that once the test has started and its status is “In progress”, you cannot change its settings. Plan your activities carefully and make sure everything is set up properly before publishing. You can, however, reschedule the test or cancel it altogether if you no longer want to run it.
    • If you create a test by duplicating an existing campaign, the ad budget is also duplicated. Otherwise, Facebook uses your current ad budgets, which you can change in Ads Manager. In that case, if you are testing two ad sets, make sure both have the same daily budget; this gives you the most reliable results. If, on the other hand, you are testing ads within a single set, the ad sets in the campaign may have different daily budgets.

    Discover the power of A/B testing on Facebook!

    Would you like to know which ads appeal best to your target audience? Or do you want to maximize the potential of Facebook and achieve the best results for your brand? If so, you’ve come to the right place. At schoolUP, we offer one-on-one coaching sessions led by experienced practitioners.

    A/B test results

    All A/B tests use the same technology to compare different creatives, audiences, and placements and to determine the winner. Once a test is completed, its results are available in the Experiments tool: simply select the Results tab from the menu on the left to see a list of all the experiments you have run.

    The results of our tests are displayed as follows:

    • The message “The winning ad was selected” identifies the ad strategy that produced better results, along with a note on whether it won by a small or large margin. Keep in mind that the winner is determined by comparing each strategy’s cost per result (the result being the event chosen when creating the test).
    • The totals recorded for each advertising strategy include cost per result, impressions, spend, and other metrics from the ad campaigns used in the A/B test.

    It is worth meticulously recording the conclusions of the tests you conduct. You can then apply that knowledge in practice, duplicating the most effective ad creatives and improving them based on the data obtained.

    Don’t be afraid to create new campaign variations combining the best-performing creatives and audience groups. Just remember to check the results on an ongoing basis each time.


    Treat each successive advertising campaign as an opportunity to learn. By this, I don’t mean endless testing, but verifying assumptions in practice and updating them regularly. It’s also worth bearing in mind that A/B testing is only one of many available tools, and we should use it in combination with other analytical methods. Remember that marketing success requires regular improvement and adaptation to the ever-changing needs of the market.


    A/B tests can be extremely helpful, especially at the beginning of your journey with Facebook Ads campaigns. They can help you understand what works better for your business and what contributes most to your results. Over time, you will have your tactics worked out well enough that creating effective ads comes much more easily. To begin with, however, comparative testing is an unquestionable financial saving, as you avoid investing money in elements that do not work for your audience.

    Let's talk!

    Aleksandra Wrońska

    She has been involved in internet marketing for two years, but despite her relatively short time in the industry, she has already run advertising campaigns for small and large companies, on both the Polish and foreign markets. She treats digital marketing not only as a job but also as a passion, which is why she works to expand her knowledge and skills every day. She joined Up&More in January 2023, where she manages projects in Facebook Ads, Google Ads and Apple Search Ads.