In this episode, we are exploring the idea of audience overlap and how it may impact your Facebook ad campaigns.
Preventing ad fatigue is something all advertisers should be aware of. However, the methodology for avoiding it varies depending on who you listen to. Facebook has created audience tools, and it is in your best interest to let its technology optimize accordingly.
Kevin and Spencer share the different scenarios for audience overlap and what may be best for your campaigns.
What is Audience Overlap?
When we are running multiple ad campaigns on Facebook, the same person may be in several of the audiences we are targeting. A good example of this is website visitors and lead conversions in the last 30 days.
In this case, the lead or opt-in conversions came through the website, so those people would also be in the website visitors audience. This creates an audience overlap.
To prevent overlap in this example, we would exclude the opt-ins from the adset targeting website visitors.
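The exclusion logic is easiest to see as simple set operations. A minimal sketch, using hypothetical user IDs rather than real audience data:

```python
# Illustrative only: audiences modeled as sets of hypothetical user IDs.
website_visitors = {"u1", "u2", "u3", "u4"}
optin_leads = {"u3", "u4"}  # these users converted via the website

# The overlap: leads who are also in the website visitors audience
overlap = website_visitors & optin_leads

# Excluding opt-ins from the website-visitors adset removes the overlap,
# so each person is only targeted by one adset
retargeting_audience = website_visitors - optin_leads

print(overlap)               # the users counted in both audiences
print(retargeting_audience)  # website visitors who have not opted in
```

This mirrors what the exclusion setting in Ads Manager does: the adset still targets website visitors, minus anyone who already converted.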
When does overlap no longer matter?
When targeting cold audiences with Facebook Ads, we are less concerned with audience overlap. An example of this would be interest targeting such as targeting Women 18 to 35 interested in Yoga or Meditation. There is probably some overlap between those audiences but we don’t typically exclude each interest from the other.
The main reason for this is that we usually have 20–50 interests we are targeting. We have not found the impact to be significant enough to justify the extra effort needed to set up the exclusions.
The same is true when using multiple lookalike audiences. Depending on the source audience a lookalike is built from, it may also have considerable overlap with your other audiences.
Split testing creatives to the same audience
Many people try to split test their creatives by duplicating adsets that target the same audience. This is one of the most common causes of audience overlap. The other issue is that Facebook does not do a great job of balancing traffic between the adsets, which skews the test results.
To help us get better test results, Facebook introduced a built-in split testing tool that will split an audience across up to 5 adsets. The audience you define for the campaign has to be the same for each adset, and Facebook ensures there is no overlap between them.
Ad delivery within the split test is also much more stable than the old method.
The downside of this method of testing is that the campaign must be scheduled to run for 3 days. Also, you cannot pause a single adset in the group if you feel it has no chance of winning. You have to keep all of the adsets running until a winner is selected, the campaign completes, or you pause it manually.
The more variations you split test, the more budget you will need to reach statistically significant results. Here is a tool from Neil Patel to determine or project statistical significance for an A/B split test of two variations.
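Under the hood, significance calculators like this typically run a two-proportion z-test on the conversion rates. A minimal sketch with hypothetical conversion numbers (the function name and sample figures are ours, not from any specific tool):

```python
import math

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: returns the z-score and two-sided p-value."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no real difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical test: variation A converts 120/2000, variation B 90/2000
z, p = ab_significance(120, 2000, 90, 2000)
print(p < 0.05)  # True means significant at the 95% confidence level
```

Notice how the sample sizes sit inside the standard-error term: halving the audience per variation inflates the error, which is why more variations demand more budget to reach the same confidence.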
Campaign Budget Optimization
We are seeing less impact from overlap when we are using campaign budget optimization. Since the optimization is moved to the campaign level instead of the adset level, Facebook seems to manage the overlap better.
In CBO campaigns we have also started using multiple ad creatives under the same adset. This helps Facebook show different creatives to the same people.
Additionally, multiple creatives also improve performance on each placement by giving Facebook several options to find the right creative for the right person on the right placement.
Here is another episode where we go deeper into Campaign Budget Optimization.
An alternative philosophy…
We always try to prevent overlap in our active campaigns as much as possible. However, we have heard reports of campaigns successfully scaled up by simply targeting the exact same audience across multiple campaigns.
We have not been able to duplicate that success when testing the same method. I have also seen many reports of campaigns using that strategy that started out well and then crashed. Whenever we are designing a campaign to scale, we are designing it for long-term results.
I guess the final word is to test for yourself and find out what works best for your business.