
Today’s episode is the recorded version of a webinar we hosted with Liftoff on Winning Creative Strategies for Games.

Our guests were Ayşe Duygu Haykir, Sr. Marketing Manager-Game at Codeway Studios; Jasmine Bautista, Senior Digital Designer at Liftoff Mobile; Faith Price, Director of Growth Marketing at DoubleDown Interactive; and Rhiannon Price, Head of Performance Creative at The Sandbox.

In this webinar, we discussed effective research processes, sourcing ideas for new creatives, using player motivations, leveraging AI to our advantage and so much more.

Sign up here to get the key takeaways via email: https://www.rocketshiphq.com/meta-fb-creative-testing-guide/

ABOUT OUR GUESTS: Ayşe Duygu Haykir | Jasmine Bautista | Faith Price | Rhiannon Price

ABOUT ROCKETSHIP HQ: Website | LinkedIn  | Twitter | YouTube


KEY HIGHLIGHTS

🎱 Research processes using paid and free tools.

🎬 The effectiveness of ad intelligence tools.

🎹 How does the research process differ for different channels?

🛹 AI tools: their evolution and effectiveness.

⏱ Crafting a hit ad within the first 3 seconds.

🙌🏻 Working with multiple teams to get great ads.

💅🏻 How to get your teams to ease into using AI.

🪐 The mechanics of the creative testing process.

🌨 Testing your test process.

🎲 How many creatives should be produced for a game?

🎁 How does the process of ideation and production change for a soft launch?

📊 Relaying creative performance data from the data team to the designers. 

KEY QUOTES

The research process doesn’t stop at external tools.

Faith: We also have an internal Slack channel where our creative team, UA team, and copy team can all share ads they’ve seen or find interesting. We believe in not just restricting our research to the gaming space, so if we come across a fun concept for a consumer product or service, we take a look at that as well.

AI tools that are gaining popularity.

Rhiannon: We’ve also used Murf, an AI voice-over tool, which has been great for pacing videos. AI voices have even worked better than human voices when advertising on TikTok. 

We’ve also used AI as an inspirational tool, as when we used Midjourney to create an AI-generated image of Winnie the Pooh for a Bingo game ad. It’s a tool that frees us up to do better work.

I use ChatGPT to generate tagline variants and write creative briefs, and it saves me 20%-30% of my time every day.

The first 3 seconds of an ad.

Duygu: It’s crucial to ensure that any UGC aligns with the content presented in the video. The intro and outro of a video should also match and maintain the same level of interest as the hook. If not, there could be a significant drop in conversion rates. Therefore, it’s important to follow up with the same content, tone, and theme throughout the video.

Getting creative on a banner image.

Jasmine: Once I have determined the animation I want to use, the next step is to think about how to make it interesting enough to keep the user’s attention once I grab it. This can include the cadence of the animation, the speed at which it plays, or color changes. I particularly enjoy using color changes as it can evoke a sense of urgency in such a simple animation. This is especially useful in gaming, where we can use UI elements that can transition from green to red to indicate the character’s health bar or time running out, and fit it into the small banner.

These techniques can also be applied to larger spaces such as interstitials or end cards, which can get more complex.

How teams are figuring out working with AI.

Rhiannon: It’s about us creating a library of what these AI tools are, and practical use cases of how we’re applying them to our workflow, and then sharing that information across the company. 

I’m not a big believer in dictating workflow to wider teams. You have to open the curtain, invite people in, and I think showing people case studies and specific use cases of how it has been successful is the best way to spread that kind of new process across companies.

The mechanics of creative testing.

Duygu: We try to keep the testing budget at around 5% to 10% of the general split. If we want to test something very specific, we try to do split tests and use the actual A/B testing method.

But if we don’t need very specific results and just want to check the general idea of a concept, we start by putting different creatives in different ad sets with the same budget and compare the results.

How does production change for a soft launch?

Faith: To ensure that we’re creating effective marketing campaigns for our games, we begin by working with the game design team to identify the most important pillars of the game. 

Once we’ve determined these pillars, we collaborate with the copy and creative teams to come up with taglines and creatives that will appeal to our target audience. 

For example, we recently launched a game with three different components, and we worked with the team to determine which direction would resonate the most with players. From there, we conducted A/B testing to determine the most effective messaging and creatives, and we use that feedback to continually iterate and improve our campaigns.

How does team collaboration work?

Faith: Each UA team member has slides in a deck and they present the top performing creatives and share competitive research.

Collaboration and proximity are crucial for successful creative testing. At work, we intentionally sit next to the creative marketing team to facilitate collaboration. Although remote work is now prevalent, we still have a weekly collaboration day where everyone in the company comes together to review and discuss creative projects in person.

FULL TRANSCRIPT

Shamanth:

Welcome to our International Women’s Day edition of the “Winning Ad Creatives for Games” webinar. We are thrilled to have an all-star panel here today, with close to 400 registrations. Let’s introduce our panelists: 

Duygu Haykir, Senior Marketing Manager for Games at Codeway Studios; 

Jasmine Bautista, Senior Digital Designer at Liftoff; 

Faith Price, Director of Growth Marketing at DoubleDown Interactive and 

Rhiannon Price, Head of Performance Creative at The Sandbox.

To start the discussion, we would love to hear from our panelists about their research process. 

Facebook Ads Library, TikTok Creative Center, and other tools can provide valuable insights that lead to creative ideas. 

Rhiannon, would you like to share your thoughts?

Rhiannon Price: 

I use Sensor Tower or App Annie, but that’s not always feasible, especially if you’re in a small company.

However, there are still ways to gain insights from competitor ads. By examining the creative assets that are being used, such as colors and composition, it’s possible to identify common patterns and trends. Advertisers are likely testing and iterating on these assets because they see positive results. By grouping these common patterns together, even without the use of paid tools, you can begin to get a sense of what’s working and how you can apply these learnings to your own creative strategy. 

So, while paid tools can certainly provide an advantage, it’s still possible to gather valuable information and improve your own performance by carefully analyzing competitor ads.

Shamanth Rao:  

We’ll talk about some of the tools and the price points that can be a hurdle for them in a bit. Faith, I’m curious if you could talk to us about your research process as well.

Faith Price: 

In terms of our research process, we strive for egalitarianism. This means that everyone on our marketing creative team and UA team is encouraged to conduct their own research. We utilize resources such as the Facebook library, the TikTok library and the AppAgent repository which includes interesting ads. We also have an internal Slack channel where our creative team, UA team, and copy team can all share ads they’ve seen or find interesting. We believe in not just restricting our research to the gaming space, so if we come across a fun concept for a consumer product or service, we take a look at that as well.

Additionally, we have access to a paid service similar to what Rhiannon mentioned. We look for themes and analyze what our competitors in the space or other genres are using. We also look at geos to understand what’s being done for a particular geo, and whether or not localization is necessary. If our competitors are localizing for a specific geo then we’ll take note and test something similar.

Rhiannon Price: 

Faith mentioned something about having a shared Slack channel. That’s something I’ve done at my current and previous jobs, and it’s been super successful. 

If you can create a really open space where no one is judged for anything they share, there’s opportunity for great research.

Shamanth Rao: 

100%. Having the entire team’s inputs can be incredibly valuable in the research process. For example, at our company, we use an Airtable database to tag and review ads, looking for patterns across different genres. This is just one way that we can spot interesting trends and gain insights from the collective input of our team.

Just to switch gears a bit, I know we started talking about ad intelligence tools, how effective do you guys find the ad intelligence tools to be? What are some of the pros and cons of some of the paid tools versus free tools? 

Duygu, I know you guys use quite a bit of these. So I am curious if you can speak to this.

Duygu Haykir: 

At our company, we use both free and paid ad intelligence tools. The paid tools we use include Sensor Tower and Mobile Action, which can be expensive but are useful for daily activities. 

Free tools only provide the ads themselves, without the data behind them. 

With the paid tools, we can conduct competitive research, network analysis, and creative analysis to gain insights on ad performance, which we can compare to download spikes in trendlines for even more insights. However, we don’t limit ourselves to just paid tools and also make use of free tools available. 

On TikTok, things are a bit different, as the platform doesn’t show ads for apps that aren’t shared. So we train our organic TikTok algorithms on competitor ads that we follow, to gain more insight into which ads they’re pushing more.

Shamanth Rao:  

So your personal feed could pull up a lot of ads that you will be able to see and study. Also Duygu, what you said about the ad intelligence tools, it sounds like these are directionally right. I think to really get the most out of them, you want to triangulate these with other data.

Duygu Haykir: 

Yeah, I just want to add that doing weekly research can help you become familiar with the data and historical trends. By comparing the patterns that work for others with your own data, you can get a better understanding of what works and what doesn’t.

Shamanth Rao:  

And how do your processes differ for different networks or platforms? 

We talked about TikTok, and how you’re getting some of those ideas. How does this change for DSPs and ad networks for that matter? 

And how do you account for any recent trends that may be somewhat more channel specific? For instance, this could be UGC focused or live actor videos when you’re generating any ideas. Faith, I’m curious if you can speak to that.

Faith Price:  

We collaborate closely with our partners to understand what works for their specific channels and platforms, and hold semi-regular meetings with them to discuss the creatives they’re seeing. 

When we expanded to CTV, we talked to CTV agencies and partners to get examples of successful game and non-game creatives. We didn’t have time or resources for bespoke creatives, so we tested some of our existing app store videos which are more cinematic and offer a larger view of the product. 

We invested in voiceovers and A/B tested different versions. We took advantage of existing partnerships and slowly expanded as we got more data and had more time.

Shamanth Rao:  

Sure, and what is working on Meta or some of the core channels will not work on the others. Like Duygu mentioned, for TikTok, the research process is very different. 

To switch gears a bit, one of the phrases or words that’s really in the zeitgeist is AI. So I’m curious to hear from you, are there AI tools that you guys use in the ideation process? 

Which ones and how do you use them? Jasmine, if you want to start off….

Jasmine Bautista:  

The creative studio at Liftoff as a whole is definitely looking into AI to design pretty crazy creatives. I know from my personal experience as a designer, I’ve been using ChatGPT as a starting baseline for creative concepts, or even to get some copy ideas. 

But, part of being in the creative studio at Liftoff is to be innovative and create out-of-the-box ideas. AI is definitely a tool that we can use to jumpstart that creative process. 

Shamanth Rao:  

Certainly, Rhiannon, you were the one who suggested we jump into this as our topic. So I’m curious to hear from you. 

What are the tools you’re using? How are you using them and what impact has it had so far?

Rhiannon Price: 

ChatGPT has become a major tool in how we operate and approach work, and I use it every day.

We’ve also used Murf, an AI voice-over tool, which has been great for pacing videos. AI voices have even worked better than human voices when advertising on TikTok. 

We’ve also used AI as an inspirational tool, as when we used Midjourney to create an AI-generated image of Winnie the Pooh for a Bingo game ad. It’s a tool that frees us up to do better work.

I use ChatGPT to generate tagline variants and write creative briefs, and it saves me 20%-30% of my time every day. 

Shamanth Rao: 

It’s definitely been a game-changer for most people I’ve spoken to. I can see that marketers are able to speak the designer’s language now by using tools like Dall-E for example.

To go on to a different topic. The first three seconds or the hook are critical and we all know this as we’ve seen data around this. What are some of the things you guys do or have seen work and maximized the impact of the first three seconds? 

Duygu, if you can share your experiences.

Duygu Haykir:  

As we focus on both apps and gaming, our approach varies depending on the platform. One effective way to engage users is through UGC, such as showcasing reactions to a product or offering users a choice to make. It could also be as simple as featuring aesthetically pleasing callouts or presenting a problem for users to solve. We also place a lot of importance on using trendy sounds and music, constantly testing and experimenting with new options.

However, it’s crucial to ensure that any UGC aligns with the content presented in the video. The intro and outro of a video should also match and maintain the same level of interest as the hook. If not, there could be a significant drop in conversion rates. Therefore, it’s important to follow up with the same content, tone, and theme throughout the video.

Shamanth Rao:  

Yeah, that consistency is so critical. Jasmine, I’m curious if you can share what your perspective is.

Jasmine Bautista:  

I mainly focus on designing banners, interstitials and native. For me, there are two elements I find very important in a creative: animation and movement, and then just drama.

Let me take you through an example of designing a banner creative. A banner creative is unique due to the limited real estate available for creativity. Therefore, the first thing I do is analyze the space and figure out how I can create movement to catch the user’s attention in the first three seconds. This can include spatial movement from point A to point B, panning images, switching through tags, or having numbers counting up and down.

Once I have determined the animation I want to use, the next step is to think about how to make it interesting enough to keep the user’s attention once I grab it. This can include the cadence of the animation, the speed at which it plays, or color changes. I particularly enjoy using color changes as it can evoke a sense of urgency in such a simple animation. This is especially useful in gaming, where we can use UI elements that can transition from green to red to indicate the character’s health bar or time running out, and fit it into the small banner.

These techniques can also be applied to larger spaces such as interstitials or end cards, which can get more complex. 

In summary, designing creative content is a fun and exciting challenge that requires careful consideration of animations/movements and drama to effectively capture the user’s attention.

Shamanth Rao: 

Yeah, that’s a whole new challenge about fitting a lot of elements into a very small digital space. We talked a bit about what the processes look like on your teams. 

Faith, you said a lot of folks add ideas to their Slack channel. What does the meeting cadence look like on your team for deciding which ads actually get made? 

Faith Price: 

When it comes to generating new ideas, we tend to focus on what the designers are most interested in pursuing. However, for iterating on existing ideas, our decisions are mainly driven by data. We closely monitor A/B tests and choose successful ideas to iterate on. This could involve something as simple as color iterations or more complex tasks such as testing the concept and structure across multiple characters and IPs.

Shamanth Rao:  

Jasmine, how do you guys work with your teams? 

Jasmine Bautista:  

I’m part of the production team. There’s another team with creative strategists that do all the research and strategy and communicate it to the production team. This includes motion, digital and interactive. 

So this can kick start cross-team brainstorms, where we start with this essential data that the creative strategist will present. Then we’ll start brainstorming and get those creative juices flowing.

I always include team members from all the production teams and we’ll share creative learnings. These sessions can happen weekly, or even multiple times a week, if a creative strategist recognizes we need a creative refresh, or we have a creative opportunity for a certain customer.

Shamanth Rao:  

Rhiannon, if we could go over to you, you talked about AI, and how exciting it is. But staying on the topic of teams, how easy do you find it to have teams adopt AI-based tools? 

I also ask, because there’s a divide in the level of enthusiasm among different team members for new tools and stuff, or to actually integrate them in the workflow. So I’m curious to hear from you, how easy has it been? What are some of the things you feel you’re doing to make sure everybody adopts these tools as a part of the day-to-day workflow?

Rhiannon Price:  

I would definitely say we’re still figuring that out. I don’t know if anybody has really fully integrated it, because the possibilities are limitless. 

However, I’m part of the growth team at The Sandbox. We are predominantly focused on maximizing return on investment and reacting really fast to market or product changes. So innately as a growth team, we are focused on time efficiency, and AI 100% supports that process. 

Our head of growth has tasked us as leaders within our areas to really focus on and find AI tools that support what we do. It’s about us creating a library of what these AI tools are, and practical use cases of how we’re applying them to our workflow, and then sharing that information across the company. 

I’m not a big believer in dictating workflow to wider teams. You have to open the curtain, invite people in, and I think showing people case studies and specific use cases of how it has been successful is the best way to spread that kind of new process across companies.

Shamanth Rao:  

Yeah, and let them experience the new workflow for themselves. Certainly something we’ve been working on as well. We’ve been playing around with a lot of tools, and one thing we’re noticing is that we’re able to get a lot more input from marketers. These tools just help you surface ideas that you wouldn’t have thought of before.

Rhiannon Price:  

Also, there are people with certain skill sets who feel pretty threatened by a lot of these tools. Copywriting is an extremely complex skill, and I don’t think ChatGPT is replacing copywriters right now. I know someone who has an in-house copywriter, and they ran a bit of a test, human vs. ChatGPT, without telling the copywriter. 

The copywriter’s work far exceeded ChatGPT’s in specificity and in the magic behind copywriting. ChatGPT is definitely limited by the data it has, and it’s only as good as the prompts it receives.

Shamanth Rao:  

I was talking to somebody who said that an AI will not take your job, but somebody who knows how to use AI will and I thought that was a very good statement. 

Just to switch gears, we talked about ideating, we talked about generating ideas, producing them and producing ads. Can you talk to us about the mechanics of the actual creative testing process on Meta or Facebook which for many, tends to be a fairly big channel? 

What are the geos that you use? What are the campaign structures that you use? Is Meta even a good place to run tests? 

Duygu Haykir:  

We usually go with our top and active geos, but since those are very competitive and expensive, we also try to test in a lot of different geos to see if they replicate the test results more cheaply. As for the campaign structure, it really depends on what you’re trying to achieve and what type of test you’re running. 

So we try to keep the testing budget at around 5% to 10% of the general split. If we want to test something very specific, we try to do split tests and use the actual A/B testing method.

But if we don’t need very specific results and just want to check the general idea of a concept, we start by putting different creatives in different ad sets with the same budget and compare the results. 

Or if we have a bunch of creatives that we just want to test without putting too much budget behind them, then we put them all together and see which one pops off. So it really depends on what you want to achieve and what type of creatives you’re testing. You don’t only test on Meta if it’s not your main source of traffic; it really differs from platform to platform. So if it’s ads for TikTok, then we test on TikTok, or we run tests on different platforms, depending on the campaign, creative, and concept. 
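The budget mechanics Duygu describes can be sketched in a few lines. This is an illustrative sketch, not her team’s actual tooling; the 7% share and the ad-set count are hypothetical numbers inside the 5% to 10% band she mentions:

```python
def split_budget(total, test_share=0.07, n_test_ad_sets=5):
    """Carve a testing slice out of the total budget and give each
    test ad set the same budget, as in the even-split approach above."""
    test_budget = total * test_share
    return {
        "scaling": total - test_budget,
        "testing": test_budget,
        "per_test_ad_set": test_budget / n_test_ad_sets,
    }

# e.g. a $10,000 daily budget with a 7% testing slice across 5 ad sets
plan = split_budget(10_000)
```

Holding the per-ad-set budget equal is what makes results comparable across creatives in this setup.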

Shamanth Rao:  

Faith, I know you guys run on a lot of non-Meta channels and I know you have shared that you don’t do a lot on Facebook. So I’m curious what’s the typical mechanism of running tests on other channels and how much of an overlap do you typically see between some of the winners on other channels and Facebook? Can you talk to us about the mechanics of running tests on other channels and what that looks like?

Faith Price:  

We’ve found that there’s not a huge crossover for us, because we run a fairly diversified media stack: what performs on Facebook doesn’t necessarily perform everywhere else. 

To optimize our ad creatives for each platform, our media buyers work closely with partners to develop A/B testing strategies that work best for their particular channels. For example, we recognize that what performs well on DSPs and TikTok is very different from what works on other platforms. To determine what ad creatives work best, we take into consideration factors such as the specific types of ads, whether they are banners or rewarded videos, and what themes or IPs were successful on Facebook.

We also work with partners to determine the best cadence for our ads on platforms like TikTok, where fatigue can set in quickly. While we do take some learnings from Facebook, there is no one-to-one correlation between what works on that platform and others.

Shamanth Rao:  

That’s something we do as well. We run separate testing processes wherever the channel is meaningfully big enough. 

Rhiannon, I’m curious if you can share your perspective on testing on channels other than Meta?

Rhiannon Price:  

A few years ago, A/B testing on Facebook was the go-to method for creative testing. However, the process is now more complex. Each platform and ad format requires a bespoke approach to testing. 

Recently, I compared Facebook and Google for creative testing, and the actual ranking of the ads was very similar, around 80%. This result gave me confidence in using Google as my testing platform, but I understand that results may vary depending on the time of testing. It’s essential to always question and test the test to ensure the validity of the results. 

Creative testing is not just about running cheap tests but about running relevant ads and getting relevant results. Spending money on irrelevant tests is a waste of resources. It’s crucial to stay updated and adapt to the changing landscape of creative testing.
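One way to quantify the “test the test” check Rhiannon describes, comparing how similarly two channels rank the same set of creatives, is a rank correlation. A minimal pure-Python sketch; the channel names and install-rate figures below are hypothetical:

```python
def rank(values):
    """1-based ranks, highest value = rank 1; ties get the average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i], reverse=True)
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of ranks i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(a, b):
    """Spearman rank correlation: +1 = identical ranking, -1 = reversed."""
    ra, rb = rank(a), rank(b)
    n = len(ra)
    mean = (n + 1) / 2
    cov = sum((x - mean) * (y - mean) for x, y in zip(ra, rb))
    var_a = sum((x - mean) ** 2 for x in ra)
    var_b = sum((y - mean) ** 2 for y in rb)
    return cov / (var_a * var_b) ** 0.5

# Hypothetical install rates for the same five creatives on two channels.
facebook = [2.1, 1.4, 3.0, 0.9, 1.8]
google = [1.9, 1.0, 2.6, 1.2, 1.5]
agreement = spearman(facebook, google)  # 0.9
```

A coefficient near 1.0 suggests the cheaper channel’s ranking is a usable proxy; a low or negative value means the learnings likely won’t transfer.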

Shamanth Rao:  

I liked the phrase you used about ‘testing your test process’. I find that is so important, because you can use a number of different frameworks. It’s certainly something we’ve been doing more and more of, especially after ATT, post-SKAN. 

On SKAN channels in particular, you have a three-day delay, your decision-making just gets very delayed. If you’ve just paused and restarted, the data looks wonky. So there’s just a bunch of complications and it’s not nearly as definitive as it used to be.

Rhiannon Price:  

I worked with a vendor once and they said they did all their testing in India because it’s really cheap, but you can’t apply those learnings from India over to the US or to tier-one countries. So what’s the point in running all those ads in India, if you’re just going to have to re-run them all at scale in your other geos?

Shamanth Rao:  

We do run some tests in India. One of the things we do is we test the testing process, we compare the results with what’s happening in tier-one geos. 

Something else we also do is we run the first round of tests in India, second round of tests in a Western European English-speaking market and third round of tests in the US.

Rhiannon Price:  

I agree with testing the process of the test. I want to spend the minimum amount of money testing the actual framework.

Shamanth Rao:  

With the creative production itself, how are you guys typically determining the number of creatives you’re producing for each game? Jasmine, I’m curious if you can speak to that.

Jasmine Bautista:  

First, I prioritize creating a basic set of creatives that covers all main inventory types, including banners, interstitials, native, and video. To optimize these creatives for the specific vertical, I reference past testing data to determine the highest performing creative sizes. 

From there, I get creative and create multiple variations of creatives for those popular sizes. For gaming, I incorporate a lot of video and interactive elements, which could include small tweaks to the best-performing creative or brand new concepts that arise from brainstorming sessions. 

Specifically for gaming, I aim to include interactive gameplay or highlight certain aspects of the game to add more variety to the creative set being tested.

Shamanth Rao:  

Duygu, I’m curious if you can speak to your planning process?

Duygu Haykir:  

We usually test videos, and it really depends on the number of videos available. We try to have five ads in testing and group the concepts together: we test the UGC-type creatives together, test the gameplays together, and compare the results against each other.

On the gaming side, if it’s a strategy game, the production process is a bit slower than on the app side, so we try to run at least one test per week. But on the app side, production is so fast, with 50 creatives produced each day. It really depends on the number of creatives.

Shamanth Rao:  

How does all of this change for a brand-new game? Or if you’re in soft launch, and you have no frames of reference? Faith, you’ve been through a number of soft launches, I’m curious if you can speak to how the process of ideation and production change for a soft launch.

Faith Price:  

So for us, it starts with working with the product team and the game design team to understand what are all of the core themes, features and mechanics in the game. To ensure that we’re creating effective marketing campaigns for our games, we begin by working with the game design team to identify the most important pillars of the game. 

Once we’ve determined these pillars, we collaborate with the copy and creative teams to come up with taglines and creatives that will appeal to our target audience. 

For example, we recently launched a game with three different components, and we worked with the team to determine which direction would resonate the most with players. From there, we conducted A/B testing to determine the most effective messaging and creatives, and we use that feedback to continually iterate and improve our campaigns.

By starting with a clear understanding of the game’s value proposition and using data-driven insights to inform our marketing efforts, we’re able to create campaigns that truly resonate with our audience.

Shamanth Rao:  

Based on my experience with early stage games and soft launch games, they’re looking for that feedback to inform the game product roadmap as well. When I’ve worked with product folks in the early stages, they want to understand what creatives are winning so they can build more of that in a product pipeline.

We talked about some of the inter-team interactions earlier, how you guys are sharing and gathering ideas. Something that I have seen to be a bit more of a challenge is to communicate results and data to designers who aren’t intuitively skilled at interpreting that data. 

So I’m curious to hear from you guys. What are the typical mechanisms that you guys use for sharing creative performance data from the data folks to designers that are actually working on stuff? 

Duygu Haykir:  

We try to keep the designers involved in creative performance evaluation as much as possible. They have access to the data and the test results, and it’s also their duty to follow up on the test results and check whether creatives are getting volume on the campaigns or not.

We try to keep this as simple as possible. It could be hard for some of them who are not really familiar with the data and the numbers, but we try to onboard them on the main metrics. In the beginning, we walk through the test results together and the reasons behind them. In our regular meetings, we discuss the latest test results together and prioritize what to do next. 

We found out that this really helps them to understand the decisions made on the pipeline, what we are prioritizing, and what we’re not really pushing through. When they research for a new creative idea, they can use this know-how to actually come up with better performing ideas. This is really a motivation factor for them because they see the impact of their efforts. This method is really working for us.

Shamanth Rao:  

I think it can be so important to make sure the designers feel included. 

When you say you’re exposing the designers to all of that data, are they looking at the ad network dashboards? Or is it a simplified version, in a manner that’s somewhat easy for them to look at?

Duygu Haykir:  

We do both. They have access to the ad dashboards themselves. But since we run a lot of campaigns on different platforms, we also share an automated report on which creatives are spending most of the budget, so they don’t need to go into each platform and check every campaign. 

Shamanth Rao:  

Something we have been doing more of is sharing asynchronous updates in Slack channels about what’s winning among the ads our designers have made. 

Faith, I’m curious to hear from you. How are you guys handling this?

Faith Price:  

We worked with the creative team to develop the right cadence for creative review meetings. Currently we do it on a monthly basis: each UA team member has slides in a deck, presents the top-performing creatives, and shares competitive research.

Collaboration and proximity are crucial for successful creative testing. At the office, we intentionally sit next to the creative marketing team to facilitate collaboration. Although remote work is now prevalent, we still have a weekly collaboration day where everyone in the company comes together to review and discuss creative projects in person. 

This allows for more effective communication and problem-solving. While remote work has its benefits, it’s important to find ways to maintain that collaboration to ensure the best possible outcomes for creative testing.

Shamanth Rao:  

Sure, there’s in-person, synchronous, and asynchronous, and it sounds like having that collaboration makes so much sense.

Rhiannon Price:  

I can say that it’s important for marketers or data analysts to break down the metrics and explain them in simple terms to the creative team. We are not experts in metrics and data analysis, so we may not understand the terminology and acronyms used in the reports. 

It’s also important to create a culture where everyone feels comfortable asking questions and seeking clarification. One-on-one sessions with the art leads or people who are invested in the creative side can definitely help in understanding the testing infrastructure and the metrics used to measure the success of creative assets. 

Shamanth Rao: 

Yeah, I think that education can be so critical. One of the things we’ve been trying to focus more on is a single metric to communicate creative performance, because it can get overwhelming. Looking at click-through rates, install rates, and so on is great for a marketer, but for cross-team communication, it’s easier to focus on one metric.

That’s something we’ve been doing much more of lately, just to make sure everyone aligns on a limited number of things rather than discussing ten different variables.

We are very close to time. If any of you guys have closing thoughts, quick takeaways. Please feel free to share.

Rhiannon Price:  

I just wanted to say that these topics are super interesting. If anybody wants to talk more, I’m always open; hit me up on LinkedIn. I’d love to chat more about this kind of topic.

Shamanth Rao:  

Certainly, cool. Huge thanks to Liftoff for partnering on this, please go check out liftoff.io. 

That’s perhaps a good place for us to wrap up this amazing webinar. Thank you ladies for just bringing your A-game while being so open and transparent with your wisdom and insights. Thank you so much.

A REQUEST BEFORE YOU GO

I have a very important favor to ask, which, as those of you who know me know, I don’t do often. If you get any pleasure or inspiration from this episode, could you PLEASE leave a review on your favorite podcasting platform – be it iTunes, Overcast, Spotify, or wherever you get your podcast fix. This podcast is very much a labor of love – and each episode takes many, many hours to put together. When you write a review, it will not only be a great deal of encouragement to us, but it will also support getting the word out about the Mobile User Acquisition Show.

Constructive criticism and suggestions for improvement are welcome, whether on podcasting platforms or by email to shamanth at rocketshiphq.com. We read all reviews, and I want to make this podcast better.

Thank you – and I look forward to seeing you with the next episode!

WANT TO SCALE PROFITABLY IN A GENERATIVE AI WORLD?

Get our free newsletter. The Mobile User Acquisition Show is a show by practitioners, for practitioners, featuring insights from the bleeding edge of growth. Our guests are some of the smartest folks we know, working on the hardest problems in growth.