
Is creative testing on Meta always a good idea? Not necessarily, especially for accounts grappling with specific financial constraints, even if your favorite agency/thought leader/Meta bro is shouting from the rooftops about irresistible hooks and aggressive creative testing. 

Dive into this episode as we dissect a compelling case study that highlights the often-overlooked complexities of introducing new creatives into campaigns. From algorithmic biases to budget limitations, we cover it all. Learn when it makes sense to innovate and when it’s better to stick with what’s already working. A must-listen for anyone navigating the intricate waters of Meta advertising.
You can read more about our recommended philosophy and approaches to creative testing in our recent book: Definitive Guide to Creative Testing in a Post-ATT World.

***





ABOUT ROCKETSHIP HQ: Website | LinkedIn  | Twitter | YouTube


FULL TRANSCRIPT BELOW

Today, let’s dive into why creative testing isn’t the best strategy for everyone, particularly for accounts with certain financial constraints. A recent audit I conducted exemplifies this point.

The Case Study: High CPA, High LTV, and Small Budget

Consider an account with high CPA and LTV values, both exceeding $250, operating on a slim budget of about $12,000 a month. The account, which has been active for many years, sees approximately one or two conversions daily (after accounting for SKAN obfuscation). Despite the low number of conversions, the account has well-established creative assets that consistently perform well.

The Dilemma: Is Creative Testing Worth It?

You might wonder, “Should I test new creatives?” Should you try to improve upon the performance of your existing creatives? Your new designer came up with a ton of innovative creative ideas – and after all, isn’t creative testing all the rage among the bros lately?

Given the budget constraints and the exclusive focus on the U.S. market that this product has, my answer leans towards “No.” 

Let’s do some back-of-envelope math and assume you’re running a creative test optimizing for installs, which is much cheaper than optimizing for purchases and other down-funnel actions. Such a test will typically cost you about $100 a day – roughly $3,000 a month, or 25% of your budget – and it will go to unproven creatives whose cost you should really write off, since install-optimized traffic is unlikely to result in purchases.

Is it worth it to spend 25% of your budget on unproven creatives? For some, the answer may be yes – if this test can unearth a new breakout winner that slashes your CPA by half.
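The back-of-envelope math above can be sketched out explicitly. The figures below are the illustrative numbers from this episode, not real account data:

```python
# Back-of-envelope math for the case study discussed above.
# All figures are illustrative assumptions from the episode.

monthly_budget = 12_000   # dollars per month
cpa = 250                 # cost per acquisition, dollars
test_spend_per_day = 100  # typical daily spend for an install-optimized test
days_per_month = 30

# Roughly how many conversions the full budget can support per day
conversions_per_day = monthly_budget / days_per_month / cpa

# Share of the monthly budget consumed by the creative test
monthly_test_spend = test_spend_per_day * days_per_month
test_share = monthly_test_spend / monthly_budget

print(f"~{conversions_per_day:.1f} conversions per day")  # ~1.6 conversions per day
print(f"test consumes {test_share:.0%} of the budget")    # 25% of the budget
```

This is why the conversion volume is so thin: at a $250+ CPA, $12,000 a month buys only about one to two conversions a day, and a $100/day test quietly eats a quarter of that budget.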

But things are not nearly as simple even if you do identify a breakout winner.

The Algorithmic Challenge: Existing vs. New Creatives

Suppose you find a winner among the new concepts. Integrating this into your existing campaign presents a hurdle: Meta’s algorithm often neglects new creatives in favor of established ones. We’ve witnessed this firsthand with the account in question – and literally every account we’ve seen – as I imagine you have too.

So: any new winner from your creative test is basically useless if Meta’s algorithm favors older established ads and does not give any spend to your winner.

Alternate Strategies and Associated Risks

You could segregate the new winner into a separate core campaign and ad group, away from your historical creatives, but the budget constraints make this impractical. This brand cannot double its monthly budget from $12,000 to $24,000 overnight – and I don’t think any brand should double its budget just for the sake of testing.

Another risky maneuver would be to disable the existing creatives to let the new one gain traction, but this could severely compromise overall performance. What if your new creatives don’t perform as well? Are you willing to jeopardize your current performance?

And even if your new creatives are objectively ‘strong’, Meta’s algorithm is going to take some time to ‘learn’ – it could take a few weeks for the new creatives to reach performance parity with your existing ones, especially since such a high CPA means very few conversion signals, which slows down the learning process even further.

Conclusion and Recommendations

If you’re managing an account like the one discussed, my advice is to avoid creative tests. What you could do is introduce minor variations to existing creatives into a core campaign. 

The only exception to this would be if your overall performance is really bad – and you need to hit the reset button, in which case you could kill your existing creatives and run new ones.

Broad testing should be reserved for accounts that have larger budgets and can basically write off a portion of their spend as a ‘testing budget.’

A REQUEST BEFORE YOU GO

I have a very important favor to ask, which, as those of you who know me know, I don’t do often. If you get any pleasure or inspiration from this episode, could you PLEASE leave a review on your favorite podcasting platform – be it iTunes, Overcast, Spotify, or wherever you get your podcast fix? This podcast is very much a labor of love – and each episode takes many, many hours to put together. When you write a review, it will not only be a great deal of encouragement to us, but it will also support getting the word out about the Mobile User Acquisition Show.

Constructive criticism and suggestions for improvement are welcome, whether on podcasting platforms or by email to shamanth@rocketshiphq.com. We read all the reviews, and I want to make this podcast better.

Thank you – and I look forward to seeing you with the next episode!

WANT TO SCALE PROFITABLY IN A GENERATIVE AI WORLD?

Get our free newsletter. The Mobile User Acquisition Show is a show by practitioners, for practitioners, featuring insights from the bleeding edge of growth. Our guests are some of the smartest folks we know, working on the hardest problems in growth.