We talked earlier about how you should minimize the cost of creative testing whenever possible. Today we talk about the flip side of low cost – how do you ensure low cost isn’t giving you junk results?
How you do this is by testing your creative testing process, which I explain in today’s episode.
We go into this – and every other aspect of creative testing post-ATT – in our new book: The Definitive Guide to Meta Creative Testing post-ATT, which you can download for free today at the link here: https://www.rocketshiphq.com/meta-fb-creative-testing-guide/
***
ABOUT ROCKETSHIP HQ: Website | LinkedIn | Twitter | YouTube
FULL TRANSCRIPT BELOW
Today’s episode covers a topic that we address in a lot more detail in our new book: Definitive Guide to Meta Creative Testing in a Post-Identifier World, which covers every aspect of how to run creative tests post-ATT, so you can run your tests in a world of incomplete data to discover winning ads with confidence.
You can check out this and our other books at rocketshiphq.com/playbooks.
In a recent episode of the podcast we went over some considerations for *where* to run your creative tests: I shared that the most important consideration for where to run creative tests is low cost.
The #1 follow-up I get when I share this is: what about the reliability of the test itself? If you minimize the cost of your testing, perhaps by testing in a low-cost geo while optimizing for an upstream event, are you sure your results will even be reliable?
Aren’t low cost tests with inaccurate results a complete waste of time?
I 100% agree.
Which is why I always recommend testing your creative test process whenever possible, and not just running the lowest cost tests.
What does this mean in practice?
If you’re running a creative test in a low-cost geo, I always recommend also running a test, when possible, in your core geo (most often the US) – to compare the results from your low-cost geo against the US.
In the vast majority of cases, we see that results in low-cost geos are very similar to those in the US – and I expect this is generally because people are fundamentally similar and respond to similar ads and appeals, irrespective of where they are.
BUT you should test your results for yourself.
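One simple way to put this into practice is to check whether both geos rank the same creatives at the top. Below is a minimal sketch of that comparison using a Spearman rank correlation between per-creative metrics in the two geos. All numbers are illustrative placeholders, not real data, and the metric (installs per thousand impressions) is just one reasonable choice of comparison metric:

```python
# Hypothetical sketch: validate a low-cost-geo creative test by checking
# whether it ranks creatives the same way the core geo (US) does.
# The IPM numbers below are made-up placeholders for illustration.

def rank(values):
    """Return the rank of each value (1 = highest value)."""
    order = sorted(range(len(values)), key=lambda i: values[i], reverse=True)
    ranks = [0] * len(values)
    for position, idx in enumerate(order, start=1):
        ranks[idx] = position
    return ranks

def spearman(xs, ys):
    """Spearman rank correlation (no tie correction) of two metric lists."""
    n = len(xs)
    rx, ry = rank(xs), rank(ys)
    d_sq = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - (6 * d_sq) / (n * (n ** 2 - 1))

# Per-creative IPM for the same five ads, run in both geos (placeholders):
ipm_low_cost = [4.2, 3.1, 5.0, 2.4, 3.8]
ipm_us       = [3.9, 3.6, 5.3, 2.1, 3.5]

rho = spearman(ipm_low_cost, ipm_us)
print(f"rank correlation: {rho:.2f}")  # → rank correlation: 0.90
```

A correlation near 1.0 suggests the cheap test ranks creatives roughly the way your core geo does; a low or negative correlation is a red flag that the low-cost test may be giving you junk results.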
In fact, we did ‘test our testing process’ to see whether a creative test optimizing for clicks in a low-cost geo would give results comparable to performance in a core geo – and the results were clear: click-optimized tests were not reliable.
And that is precisely why you should test your creative testing process, whenever it is realistically possible for you.
Why do I say ‘whenever possible’, when I recommend testing your creative test process?
That’s because oftentimes, especially when you are at small budgets, it’s just not realistic or possible to test your testing process. If you are at small budgets, you really don’t have the budget to run a comparison test in the US against your test results from a low-cost geo.
In these cases, it’s often best to minimize cost and trust that you’ll get at least directional results from your low-cost tests.
But whenever possible in other scenarios, I recommend testing your testing process – to make your creative testing methodology bulletproof.
A REQUEST BEFORE YOU GO
I have a very important favor to ask, which, as those of you who know me know, I don’t do often. If you get any pleasure or inspiration from this episode, could you PLEASE leave a review on your favorite podcasting platform – be it iTunes, Overcast, Spotify or wherever you get your podcast fix. This podcast is very much a labor of love – and each episode takes many, many hours to put together. When you write a review, it will not only be a great deal of encouragement to us, but it will also support getting the word out about the Mobile User Acquisition Show.
Constructive criticism and suggestions for improvement are welcome, whether on podcasting platforms – or by email to shamanth at rocketshiphq.com. We read all reviews, and I want to make this podcast better.
Thank you – and I look forward to seeing you with the next episode!