
Our guest today is Adam Lovallo, CEO and founder of Thesis, host of the MAU [Talk] podcast, and founder of grow.co. Adam was our first guest on How Things Grow, and his episode on the rise and fall of daily deals was a very interesting behind-the-scenes peek into a roller-coaster of a time that he lived through. Today, we are excited to talk to him about a subject he feels strongly about: incrementality.

We start by talking about what incrementality testing is, how to identify signs of underlying issues in campaigns, and why the numbers reported by Facebook and Google can be problematic. Adam then goes on to describe prescriptions for how UA teams can solve incrementality issues, discussing the efficacy and potential outcomes of each.

Finally, we talk about the role of MMPs, how iOS 14 plays into the picture, the critical importance of channel mix in budgeting and performance decisions, and the importance of seemingly old-school ways to measure performance like blended CPAs or marketing-as-percentage-of-sales. This is a masterclass in understanding the true impact of your marketing, and an unmissable episode for anyone who has questions about how to diagnose issues and implement fixes in their strategy. Enjoy!






ABOUT ADAM: LinkedIn  | Thesis | Grow.co | MAU [Talk]




ABOUT ROCKETSHIP HQ: Website | LinkedIn  | Twitter | YouTube


KEY HIGHLIGHTS

🤔 What are incremental conversions

🚩 The first signs of an incrementality issue

💥 Results from an incrementality test can be startling to advertisers

💊 Prescriptions to fix incrementality issues

🔀 Why you should change up your channel mix

🙋 Self-attributing channels are configured to take credit

⛔ The limitations of MMPs with tracking

🌎 How geographic targeting can generate clean data

🧮 The importance of using blended numbers to determine CAC

KEY QUOTES

What does it mean to have an incrementality problem

If you have an incrementality problem, it means that you’re investing in acquisition channels—whatever they are—that might appear to be driving conversions, but in fact are just taking credit for activity that would otherwise have happened normally.

An early signal of an incrementality problem

Look at the ratio of conversions that are coming 1-day post click versus 7-day post click. Basically, 1-day post click conversions are much more likely to be accretive than even 7-day post click conversions.

Regular CACs can be misleading

They say: “Our CAC is 70.” Okay, seems pretty good. B2C subscription ecommerce at 70? Alright. But I looked at it, and said: “Ooh, a lot of the conversions are coming in on a view-through basis.” Like a LOT. And if you look at the performance solely on a 1-day click basis, it looked horrible. Horrible, horrible.

So we ran an incrementality test, and lo and behold, the cost per incremental purchaser was literally $1,000+. So more than 10 times—13 times or something—what they thought they were acquiring customers at.

The last ditch prescription to solve incrementality

If you have that kind of a problem, and you do all the steps, and it’s still looking bad, it’s a channels question. Change the channel mix. If Facebook’s not accretive, or Google’s not accretive, whatever? Do something else. Maybe revisit it in 6 months or something, but you’re trying to push a boulder uphill. It’s just, in my experience, really hard. 

There are radical alternatives too

And then I have a subset of clients, even today, who run the incrementality tests, the results are not so great, so then they choose to ignore the incrementality test and just keep doing what they’re doing. Just like willful ignorance, which honestly, I respect. That’s a very aggressive choice; you got to be real confident to make that call. So that’s another solution that honestly is employed more frequently than you might think.

Some metrics you have to take with a grain of salt

Say: “Well, if I bid for 1-day click then my CACs are going to be terrible.” Well, your CACs are terrible. You just choose to look at the numbers that look better. I cannot emphasize enough: who gives a shit what the Facebook pixel says? It’s meaningless.

Incrementality often exists with multiple channels

The more channels you have running, the more likely you are to run into these kinds of problems. Again, the root problem here is just that the platforms are configured, via their attribution settings, to take credit for the conversions that they drive. And really, they’re configured to take credit for as many conversions as possible.

How to use geographies to get clean data

I think, for an app install advertiser, the main solution is to think about geographies. And say: “Well, I’m going to run these channels on and off; I’m going to run these channels on and off; these channels on and off.” Based on—it could be countries, or even better, it could be zip codes or postal codes at that level, so it’s sort of randomized. And then look at performance based on geographies.

The right way to approach optimization of performance

If you approach the problem from the perspective of “I don’t care what the pixels say, I just care about the outcome,” then you work back. You start with the thing you care about, and then you work back.

Going old school is a viable solution

Certainly looking at blended CPA, or marketing as a percentage of sales is old school, obviously, but that’s a good grounding number to keep in mind, basically, to keep yourself honest, as a performance marketer.

FULL TRANSCRIPT BELOW

Shamanth: I’m very excited to welcome Adam Lovallo to the Mobile User Acquisition Show. Adam, welcome to the show.

Adam: Thank you.

Shamanth: It’s an honor to have you, Adam, certainly because you were our first guest on our other podcast, How Things Grow—yours was a hugely popular episode—you took us down memory lane of somewhat crazy times. Also, of course, I have looked up to you every time we’ve met in New York, at MAU—which you run. And you have a podcast of your own, MAU Talk. Today, I would love for us to talk about incrementality, which is something you think a lot about and speak a lot about; it’s certainly a topic that’s very close to your heart. So for all those reasons, I’m excited to have you here today.

I know you have seen hundreds of accounts by now, if not more, and when you say you’re looking at an account and they might have an incrementality problem, what does that mean, typically? 

Adam: So generally speaking,

if you have an incrementality problem, it means that you’re investing in acquisition channels—whatever they are—that might appear to be driving conversions, but in fact are just taking credit for activity that would otherwise have happened normally.

An incremental conversion is exactly that: a conversion that is additional to what would have happened as the baseline. 

It’s accretive—same sort of terminology—and so, particularly as businesses get bigger, it becomes harder and harder for paid channels to drive incremental conversions, for a myriad of reasons, but especially because the platforms are very good at taking credit for conversions, and are actually optimizing to take credit for conversions, not just to drive conversions. And they can sometimes be working against your own interests, if your real objective is to drive incremental conversions. And so it’s definitely a challenge. I see it a lot. And it’s a category- and platform-agnostic problem that can touch basically all types of performance advertisers—or all types of advertisers, really.

Shamanth: Yeah, I like the distinction you made about the real objective being to drive incremental conversions and not just conversions. And I think that it’s an important distinction that can become lost, in even some of the bigger teams, even from publicly traded companies that I have seen, certainly.

Adam: For sure. 

Shamanth: Yeah. And when you’re looking at a lot of these accounts, what are some of the first signs or the most obvious signs that you see that an advertiser or an account might have an incrementality problem? 

Adam: A couple things. If you see that they’re currently considering view-through conversions as conversions, that in itself is okay. For a very long time, Facebook’s default attribution window was 28-day post click, 1-day post view. More recently, it became 7-day post click, 1-day post view, and soon the default will be 7-day click. But the default method for attribution, in this moment, counts view-through conversions—which is fine, there’s nothing wrong with that. That said, if you have an advertiser that’s using that default attribution—so they’re considering view-through conversions, maybe they’re even looking at a longer view-through window—and the view-through conversions constitute a very large percentage of all of the conversions? That’s a sign. So if, say, 50% of the conversions that I’m taking credit for are 1-day view and another 50% are 7-day post click, that’s a little sketchy. So that’s sign number one.

Sign number two: if you see that retargeting or remarketing, depending on your preferred terminology, is more than 15% of the budget or maybe more than 20% of the budget, that’s another sign. And those two signs in tandem often will lead you down the path to finding an incrementality issue, because it’s a lot of view-through, a lot of retargeting. Similarly, if branded search is a really big “channel”, or place of investment, that’s another telltale sign. Those aren’t the only signals but those are big ones. 

Also,

look at the ratio of conversions that are coming 1-day post click versus 7-day post click. Basically, 1-day post click conversions are much more likely to be accretive than even 7-day post click conversions.

So use some of those signals, and if a business is checking each of those boxes, there’s a 99% chance that if they were to run a holdout test or incrementality test, it’s going to be pretty devastating—that’s my terminology. It will blow up their understanding of their own performance numbers.
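To make these warning signs concrete, here is a minimal sketch in Python of the checks Adam describes. All of the numbers, field names, and exact thresholds below are hypothetical assumptions for illustration; only the rough thresholds (view-through share, retargeting share of budget, and the 1-day versus 7-day click ratio) come from the conversation above.

```python
# Minimal sketch of the incrementality warning-sign checks, using hypothetical
# numbers pulled from an ad platform's attribution-window breakdown.
conversions = {
    "1d_view": 4_200,    # view-through conversions
    "1d_click": 1_100,
    "7d_click": 3_300,   # all click conversions within 7 days (includes day 1)
}
spend = {"prospecting": 55_000, "retargeting": 30_000, "branded_search": 15_000}

total_conversions = conversions["1d_view"] + conversions["7d_click"]
view_through_share = conversions["1d_view"] / total_conversions
retargeting_share = spend["retargeting"] / sum(spend.values())
click_window_ratio = conversions["1d_click"] / conversions["7d_click"]

print(f"View-through share of reported conversions: {view_through_share:.0%}")
print(f"Retargeting share of budget:                {retargeting_share:.0%}")
print(f"1-day click / 7-day click conversions:      {click_window_ratio:.2f}")

# Illustrative flags based on the rough thresholds mentioned in the conversation.
if view_through_share > 0.5:
    print("Flag: view-through conversions dominate reported results.")
if retargeting_share > 0.15:
    print("Flag: retargeting is more than ~15% of spend.")
if click_window_ratio < 0.5:
    print("Flag: most click conversions arrive well after day 1.")
```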

Shamanth: Yeah. When you say it can be devastating, of all the accounts you’ve seen, what are some of the typical situations when an advertiser has been surprised or shocked by the results?

Adam: I’ll give you a real example. I was talking to somebody—he’s cool with me sharing this anecdote and I won’t name the brand. Smart dude, smart team, you know, all that.

They say: “Our CAC is 70.” Okay, seems pretty good. B2C subscription ecommerce at 70? Alright. But I looked at it, and said: “Ooh, a lot of the conversions are coming in on a view-through basis.” Like a LOT. And if you look at the performance solely on a 1-day click basis, it looked horrible. Horrible, horrible.

So we ran an incrementality test, and lo and behold, the cost per incremental purchaser was literally $1,000+. So more than 10 times—13 times or something—what they thought they were acquiring customers at.

So to me, that’s pretty devastating. Basically, for that business, you could turn this entire channel off, it’s not going to do anything. It’s not doing anything. 

So that is 10x; that’s pretty bad. What you want to see for a good account: it runs an incrementality test, and maybe the incremental CACs are 30 or 50% higher than what the pixel’s reporting? That’s okay, you can swallow that. That’s alright. But if you start pushing 5 or 10x, it means that every decision that you’ve ever made with respect to budgeting, planning and acquisition, all that stuff, was wildly off. That’s pretty bad.
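For readers who want to see the arithmetic behind a holdout test, here is a minimal sketch using hypothetical numbers chosen only to be in the spirit of this anecdote; none of the figures are the advertiser's actual data.

```python
# Sketch of incremental CAC from a holdout (incrementality) test.
spend = 130_000                          # media spend during the test
platform_reported_conversions = 1_857    # what the pixel claims
test_group_conversions = 2_400           # conversions in the exposed cell
control_group_conversions = 2_270        # holdout cell, scaled to the same audience size

reported_cac = spend / platform_reported_conversions
incremental_conversions = test_group_conversions - control_group_conversions
incremental_cac = spend / incremental_conversions

print(f"Reported CAC:    ${reported_cac:,.0f}")     # ~$70
print(f"Incremental CAC: ${incremental_cac:,.0f}")  # ~$1,000
# Roughly 14x here; in the same ballpark as the example in the conversation.
print(f"Ratio:           {incremental_cac / reported_cac:.1f}x")
```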

Shamanth: Yeah, and that can upend the fundamentals of their business economics. 

Adam: Absolutely.

Shamanth: What do they do from there? 

Adam: There are some prescriptions that are straightforward. Okay, well, we’re gonna change the attribution window. Great. You can run a test—Facebook will set up such a test with their reps—and say: “Alright, well, what if I optimize for 1-day click? What about 7-day click? Default? Whatever. What’s my incremental CAC, based on my different optimization windows?”

And maybe just by changing the optimization window, you solve the problem. Okay. These are not mutually exclusive solutions, mind you, but you can say: “Alright, well, maybe I just need to tighten up my exclusions. If I’m not being as aggressive as I can be in terms of excluding audiences, maybe I could do better there.” By the way, going back to your prior question, that’d be another telltale sign. If your audience exclusions are weak, non-comprehensive, that’s a bad sign for prospecting. So, okay, I take my exclusions, I find a different optimization window. To some extent, it can be a budget allocation question. Well, what if I just make prospecting 90% of my budget instead of 50% of my budget? I dial my remarketing targeting down. 

Those are easy prescriptions; testable, click some buttons, you’re good to go. Or in the case of branded search, maybe I try pausing branded search for some time. If you do all of those things—I have clients and I’ve spoken to people who’ve done all of those arguably easy prescriptions—and you still have an incrementality problem, then it’s a bit more esoteric. Well shoot, what do I do?

I’ve seen a couple solutions; I don’t think any of these solutions are that great. The first is, well, I’m going to create creative that is more likely to attract an incremental audience, as opposed to my existing audience. Easier said than done. I don’t know how you do that. I have absolutely no idea how you would go about that, so that solution never really spoke to me, frankly. But I’ve heard people say: “Yeah, well, we’re gonna try to address the creative.” Okay. 

Secondly, I’ve heard people say: “Well, we’re optimizing our campaigns for maybe purchase, or if it’s an app, some sort of in-app event. We’re just going to try a more classical approach to media buying: 30% of the budget is going to be reach and frequency; 30% of the budget is going to be optimizing for video views. We’re basically going to take our campaign that used to be purely purchase- or app-install-optimized and focused, and we’re going to have more of a mix of strategies, with the rationale being that the purchase optimization is not driving incremental purchasers, so maybe I just need to hit different pockets of the audience. Because Facebook’s only optimizing for these purchasers, well, what if I optimize for other brand lift metrics and stuff?”

Not a crazy prescription. Also, not exactly my favorite, to be honest, in that it comes with its own pitfalls. Now you’re running video view campaigns and you’re hitting subsets of the population that just suck—perfectly nice people, no doubt—but they’re literally less valuable, as evidenced by the CPMs. The CPMs for audiences are essentially the auction of advertisers deciding what the impressions are worth. So, lo and behold, you start optimizing for video views, the CPM goes to $2 from $20; they’re 1/10 as valuable for the entire auction. So that’s problematic. You trade one problem of incrementality for another problem of, potentially at least, waste.

There’s no real easy solution. The only other thing I can call out is, Facebook had this beta functionality to do incremental purchase optimization. So you go in the ad set, instead of clicking purchase, you click incremental purchase. And my understanding is they were using a different model to do the optimization. I think that’s going away because of all of the iOS 14 stuff. We have one client using that functionality, and they told us to stop using it. So I’m not sure if this incremental optimization stuff is the real solution. 

So I think the best prescription is toโ€”

if you have that kind of a problem, and you do all the steps, and it’s still looking bad, it’s a channels question. Change the channel mix. If Facebook’s not accretive, or Google’s not accretive, whatever? Do something else. Maybe revisit it in 6 months or something, but you’re trying to push a boulder uphill. It’s just, in my experience, really hard. 

And then I have a subset of clients, even today, who run the incrementality tests, the results are not so great, so then they choose to ignore the incrementality test and just keep doing what they’re doing. Just like willful ignorance, which honestly, I respect. That’s a very aggressive choice; you got to be real confident to make that call. So that’s another solution that honestly is employed more frequently than you might think.

Shamanth: Yeah, I have certainly seen that last solution, like I said, certainly in a publicly traded company, I have seen that, among others. But what you’re saying underscores that there are some low hanging fruit that absolutely can be fixed, but maybe there’s an underlying problem or challenge with the channels themselves. 

Speaking of some of the lower-hanging fruit: you said, “Look, if a lot of the conversions are 1-day click, then it’s very likely to be accretive.” One fix would just be to optimize for 1-day click, or maybe even 7-day click, and make that happen. One risk with that could be that the CPMs might go up, because you are, by definition, bidding for more targeted users—and the algorithm is picking up on the fact that these are more valuable users—so if an advertiser has a reservation about this and says: “Look, I could bid for 1-day click, but that’s going to kill my costs,” among the companies that you advise, if that comes up, how do you typically address that?

Adam: It’s a straightforward problem. Let’s assume you run an incrementality test and the results weren’t great. That’s why we’re having this conversation.

Say: “Well, if I bid for 1-day click then my CACs are going to be terrible.” Well, your CACs are terrible. You just choose to look at the numbers that look better. I cannot emphasize enough: who gives a shit what the Facebook pixel says? It’s meaningless.

You really only care about the marginal cost or the incremental cost to acquire customers. That’s how you can make a decision, a budgeting decision. So they say: “Oh no, the CPMs are low.” Who cares? Right now, you’re spending money at $1,000 incremental CAC so you might as well just be setting it on fire. 

The better argument is, well, if the media costs get so expensive that even still you can’t drive incremental sales efficiently, it’s no longer an incrementality problem. Now, it’s just a regular old performance problem. Tough luck. You’ve got to change things up, you’ve got to look at the channel, mix things up, do the usual user acquisition stuff. But it’s an idiotic argument, basically; particularly in the face of an incrementality result to say, “Yeah, but if I change it, it’s gonna be bad.” Dude, it’s already bad.

Shamanth: Yeah. I see. Is this more of a problem when you are on multiple channels than on just Facebook? Or is that not an accurate understanding?

Adam: Well yeah, I think the more channels you have in the mix—particularly digital channels that auto-optimize for purchases based on their own respective attribution windows—the probability that there’s an incrementality challenge is higher, just because there are multiple channels taking credit, and multiple channels optimizing for the same purchase.

For instance, say you have a business and you only run Snapchat. You spend a million dollars on Snapchat; that’s the only thing you do. You do literally nothing else, and you drive $2 million a month in sales. Okay, you could probably reasonably say that those $2 million are a function of Snapchat, and maybe Snapchat’s taking a little bit of credit for press and direct and organic search and stuff. But you know, it’s fine.

If you then simultaneously add a million dollars in Facebook spend, and it appears based on the pixel to be doing really, really well but then your overall business only goes up by $100,000 in sales, you’ll say: “Oh, shoot, now we’ve got an incrementality problem.” 

So I’d say, yeah,

the more channels you have running, the more likely you are to run into these kinds of problems. Again, the root problem here is just that the platforms are configured, via their attribution settings, to take credit for the conversions that they drive. And really, they’re configured to take credit for as many conversions as possible.

That’s what the systems are built to do, because that’s how they get paid. Well, that’s certainly how they get evaluated. So if you have lots and lots and lots of systems doing exactly that, lo and behold, they’re all trying to get in front of that one person who they know is going to buy, because they visited the site a day ago or whatever. So I’d say it’s more of a problem.

That said, I definitely have seen businesses that really have only one paid channel, but they have a brand, they’ve got direct traffic, they’ve got organic, they’ve got ASO stuff. They have other sources, but they really only have one paid source. And you might say, oh, well, that’s probably not an incrementality problem there. And I’ve even seen instances where that is not the case, where there is a major incrementality problem, and they just don’t realize it. So it’s not like it only applies to companies doing lots of stuff or big companies; it can be a major problem for even medium-sized companies. I’d say by definition, it’s not a problem for small companies, logically. But yeah, the more channels, probably the more likely you’re going to run into problems.

Shamanth: Yeah. For somebody that is running multiple channels and says, “Look, let’s use MMPs if we’re mobile, or Google Analytics if we’re web, to mediate between the channels”: is there a way that could help them understand which channels are performing better, or do you feel like that’s not likely to happen?

Adam: A couple disclaimers: I like all the MMP businesses. I think they’re great. I think they’ve done a really good job in responding to this iOS 14 stuff and SKAdNetwork and dealing with it for advertisers. So this is not a knock on MMPs. But MMPs have some major limitations that are a function of how the platforms work.

Number one: they literally can only track people who click and install and stuff. There’s no view-through, nothing. That’s the limitation imposed on them by the channels, and by Apple and Google. I’m sure if they could track all that stuff they would gladly. So you’re already willfully ignoring—when you’re using MMP tracking—any value of impressions that do not translate to clicks and installs in that moment. You’re saying: “I don’t consider those; they have no value.” Which, obviously, is not the case, practically speaking. That’s the number one challenge.

Number two: you’re essentially doing last-click attribution with the average MMP platform. That’s the basic logic. And yeah, there are self-attributing channels like Facebook and Google that have their own logic, which is largely unknown to us. And yes, I know that some of the MMPs have their own “multitouch weighting” solutions, where essentially you can factor in all of the clicks—again, only clicks are being considered—but nonetheless, all of the clicks, and say, oh, well, there’s this click, and then that click, and then that click, etc.

So it’s not bad, I suppose. I don’t think they’re harmful in looking at these problems, but at the end of the day, so much of the app install volume is coming through self-attributing platforms that we don’t—as app install advertisers—really know that much about what’s going on. It’s just that whatever Google and Facebook say is going on is what we’re getting credit for. So I don’t think the MMPs have any super obvious solution here.

What I think you’re seeing more in the app install ecosystem is people talking about: “Oh, well, you’ve got to do media mix modeling at a more elevated level.” Media mix modeling and incrementality testing are kind of the same thing. A media mix model is trying to establish what’s the right media mix that generates the best outcome at the channel level, and incrementality tests basically are trying to prove out what’s the right level for each channel that generates the best outcome. Slightly different approach, but really, fundamentally, they’re in the same school of thought. So that, I think, is a more likely solution/way forward on these sorts of problems. 

Maybe the MMPs start to build media mix modeling/forecasting functionality into their tools? I think that would be a logical product thing. What-if scenario analysis: if I spent this much on this platform and this much on this platform, what do I expect the outcome to be? And then they actually run a test and prove that. I think that’d be cool. But yeah, the fact that they can only attribute clicks and installs and stuff, and they’re ignoring the value of views entirely and engagements entirely makes them, I think, a little less well positioned to take on some of this incrementality business.
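Since media mix modeling comes up here as a likely way forward, here is a toy sketch of the underlying idea under the simplest possible assumptions: a plain linear regression of an overall outcome on channel-level spend, fitted on synthetic data, then used for a what-if budget scenario. Real MMMs add adstock, saturation curves, seasonality, and priors; nothing below reflects any particular vendor's product.

```python
# Toy media-mix-modeling sketch: regress weekly sales on channel spend,
# then use the fitted coefficients for a what-if budget scenario.
import numpy as np

rng = np.random.default_rng(0)
weeks = 52
spend = rng.uniform(10_000, 60_000, size=(weeks, 3))  # Facebook, Google, Snapchat
true_effect = np.array([0.8, 1.5, 0.4])               # hypothetical $ sales per $ spend
baseline = 200_000                                     # organic / brand-driven sales
sales = baseline + spend @ true_effect + rng.normal(0, 10_000, size=weeks)

# Ordinary least squares with an intercept for the organic baseline.
X = np.column_stack([np.ones(weeks), spend])
coef, *_ = np.linalg.lstsq(X, sales, rcond=None)
intercept, effects = coef[0], coef[1:]

for name, beta in zip(["Facebook", "Google", "Snapchat"], effects):
    print(f"{name}: ~${beta:.2f} in sales per $1 of spend")

# What-if scenario: shift $20k/week from the weakest channel to the strongest.
scenario = spend.mean(axis=0).copy()
scenario[np.argmin(effects)] -= 20_000
scenario[np.argmax(effects)] += 20_000
print(f"Projected weekly sales under the shifted budget: "
      f"${intercept + scenario @ effects:,.0f}")
```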

Shamanth: Yeah, and of course, with SKAdNetwork, it certainly becomes a lot trickier.

Adam: Yeah, you have even less visibility than you had before. So that’s already blowing it up. Really,

I think, for an app install advertiser, the main solution is to think about geographies. And say: “Well, I’m going to run these channels on and off; I’m going to run these channels on and off; these channels on and off.” Based on—it could be countries, or even better, it could be zip codes or postal codes at that level, so it’s sort of randomized. And then look at performance based on geographies.

That’s how holdout testing has been done historically, in channels like radio and TV, and largely thanks to all of this iOS 14-IDFA business, that sort of geographic holdout testing is going to be more popular. It’s going to be more practical, because, for instance, Facebook is saying: “Hey, if it’s an opted-out user? They’re on Facebook and Instagram, they opt out of the ATT prompt thing, we’re not going to even include them in our lift test.” So there you go, 50-60% of the population, they’re gone. You can’t run this sort of measurement. But you could always say: “California, New York, I’m going to turn off search; in Ohio and Pennsylvania, I’m going to turn off Facebook.” And whatever you want. And again, zip codes would probably be smarter, but for the sake of argument, you can run that test across all platforms, any time of the day, app install or not, don’t care. As long as you can report on results at that same geographic level, you’ll get real clean data out of it. So that I think will be more common than it is today; it’s very rarely done today in digital. That’s probably the most ideal solution, given the constraints that are coming in now.
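As a rough illustration of what that geo-based holdout might look like, here is a minimal sketch that randomly assigns hypothetical zip codes to "on" and "off" cells for one channel and reads out lift. The data is simulated, the channel name is just a placeholder, and a real test would need significance checks and geo-size matching.

```python
# Minimal geo-holdout sketch: randomize geos per channel, compare outcomes per geo.
import random
from statistics import mean

random.seed(42)
zip_codes = [f"zip_{i:03d}" for i in range(200)]
random.shuffle(zip_codes)
facebook_on, facebook_off = zip_codes[:100], zip_codes[100:]

# Hypothetical per-geo conversions observed during the test window
# (e.g., from internal analytics reported at the same geographic level).
conversions = {z: random.gauss(50, 8) for z in zip_codes}
for z in facebook_on:
    conversions[z] += 3  # simulated true incremental effect of the channel

lift_per_geo = (mean(conversions[z] for z in facebook_on)
                - mean(conversions[z] for z in facebook_off))
total_incremental = lift_per_geo * len(facebook_on)
spend_in_on_geos = 40_000

print(f"Estimated lift per geo:      {lift_per_geo:.1f} conversions")
print(f"Estimated incremental total: {total_incremental:.0f} conversions")
print(f"Implied incremental CAC:     ${spend_in_on_geos / total_incremental:,.0f}")
```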

Shamanth: Sure, and something else that you have also written about is looking at the blended numbers because that’s literally a source of truth and using that to guide all decision-making.

Adam:

If you approach the problem from the perspective of “I don’t care what the pixels say, I just care about the outcome,” then you work back. You start with the thing you care about, and then you work back.

You’re trying to come up with a rules-based system that says: “Alright, well, I want my blended cost of acquisition to be $100, and when I do a $50 CAC in Facebook and a $20 CAC in Google and a $200 CAC in Snapchat, it seems like my blended number comes out to be 100. And that’s what I wanted anyway, and now I’m going to change it. What if I make my Facebook number 70 and my Snapchat number 150, whatever.”

That is a very simplistic approach, mind you, but it does keep you honest, so you don’t find yourself in a position where: “Oh, well, my CPIs are $3, my cost of acquiring a paid player is $20, and yet the business sucks overall.” If you’ve been keeping your eye on the numbers you actually cared about the whole time, you probably would have identified a problem sooner; versus, you could keep running according to what the Facebook SDK says, until the end of time, and keep hitting your $20 per new player number.

So

certainly looking at blended CPA, or marketing as a percentage of sales is old school, obviously, but that’s a good grounding number to keep in mind, basically, to keep yourself honest, as a performance marketer.
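For concreteness, here is a small sketch of those grounding numbers: blended CAC and marketing as a percentage of sales, computed from totals rather than from any platform's attributed conversions. The spend and customer figures are hypothetical, chosen to land on the $100 blended target from the example above.

```python
# Blended CAC and marketing-as-a-percentage-of-sales from totals only.
channel_spend = {"facebook": 50_000, "google": 40_000, "snapchat": 10_000}
total_new_customers = 1_000    # from your own backend, not a pixel
total_sales = 400_000          # revenue over the same period

blended_cac = sum(channel_spend.values()) / total_new_customers
marketing_pct_of_sales = sum(channel_spend.values()) / total_sales

print(f"Blended CAC:               ${blended_cac:,.0f}")
print(f"Marketing as a % of sales: {marketing_pct_of_sales:.0%}")

# Rules-based check against a target, per the example in the conversation.
target_blended_cac = 100
if blended_cac > target_blended_cac:
    print("Blended CAC above target: rebalance per-channel CAC targets.")
```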

Shamanth: Yeah, that’s certainly something we are advocating amid the uncertainty of the iOS 14 changes: until we sort out what’s happening, let’s look at the blended numbers. Let’s see.

Adam: I think especially for the app install world, you know, the SKAdNetwork’s like: “You can track people for 24 hours and whatever.” Instantly from the report, a lot of the attributed conversions are gone. So you just have to look at the blended and then again, establish your business rules, which say: “Well, okay, used to be that when I had a $5 CPI, it worked out. Now if I have a $7 CPI, it’s okay, as long as blah, blah, blah.” Sort of reset all of those guideposts. I think that’s a practical reality.

Shamanth: Yeah, certainly. So much food for thought, and as has been the case every time I’ve spoken to you, Adam, I’ve learned a ton. This is certainly something I will forward to our team. And of course, I’m excited to post this on the podcast. This is perhaps a good place for us to wrap. Before we do that, can you tell folks how they can find out about you and everything you do?

Adam: The easiest path is my weekly newsletter, which is grow.co. I’ve written it for many, many, many years. In that newsletter, we promote the MAU events, which are our in-person events for the mobile app install ecosystem. We’ll promote our MAU [Talk] podcast, which is my podcast in the app install ecosystem. And then, in my day job, I have my own growth agency; I have a team of 60 or 70, we’re working on lots of campaigns, and we’re called Thesis. I write a lot about attribution and tracking and all the stuff we’ve basically just discussed, and I include those blog posts in that grow.co newsletter every week. So that’s probably the easiest way to track me down if anybody wants to.

Shamanth: Wonderful, and we will link to all of that in the show notes and the highlights. But for now, thank you so much, Adam, for being on the Mobile User Acquisition Show.

Adam: Oh, my pleasure.

A REQUEST BEFORE YOU GO

I have a very important favor to ask, which as those of you who know me know I don’t do often. If you get any pleasure or inspiration from this episode, could you PLEASE leave a review on your favorite podcasting platform – be it iTunes, Overcast, Spotify or wherever you get your podcast fix. This podcast is very much a labor of love, and each episode takes many, many hours to put together. When you write a review, it will not only be a great deal of encouragement to us, but it will also support getting the word out about the Mobile User Acquisition Show.

Constructive criticism and suggestions for improvement are welcome, whether on podcasting platforms – or by email to shamanth at rocketshiphq.com. We read all reviews & I want to make this podcast better.

Thank you – and I look forward to seeing you with the next episode!

WANT TO SCALE PROFITABLY IN A POST-IDENTIFIER WORLD?

Get our free newsletter. The Mobile User Acquisition Show is a show by practitioners, for practitioners, featuring insights from the bleeding edge of growth. Our guests are some of the smartest folks we know, working on the hardest problems in growth.