Mobile-app advertising channels and networks began shifting away from manually controlled campaigns after 2017, and by 2021 most major UA sources had completed the transition to fully ML-controlled campaigns. While this saved much of the time marketers formerly spent digging through settings menus, running thorough analyses of targeting and metric trends, and performing bulk changes, it also complicated and obscured the role creatives play in UA. Finding a way to legitimately test creatives can unlock a major performance-optimization lever in any UA strategy.
Every major UA channel offers some semblance of “creative testing” built into its advertising ML, but this is a far cry from the pre-ML days, when structured, transparent, and methodical creative testing tools were standard parts of advertising channels. While we can’t go back to the way Google Search and YouTube App Install Campaigns worked before UAC existed, we can acknowledge the limitations of typical approaches to creative testing and take steps toward better testing outcomes. This matters for two reasons: so we don’t miss out on the UA performance and creative strategy gains that come from a methodical creative testing framework, and so we don’t waste valuable production time and resources on false or premature assumptions drawn from insufficient creative performance data.
For many, “creative testing” in mobile app UA means simply adding creatives to a campaign and assuming the resulting campaign behavior (read: the machine learning algorithm’s behavior) is an accurate — and the only possible — representation of creative performance. This is a false assumption, though it is a convenient and straightforward way to interface with the UA channels and networks available at present. To use the word “testing” alongside “creative” truthfully means using a repeatable, consistent method to produce creative test results. After all, in the scientific method — where the whole idea of ‘tests’ comes from — tests must be repeatable and verifiable to inspire confidence in any conclusions drawn from them. Testing in the accurate sense also means forming a hypothesis and a conclusion for each test. A true A/B test pits one creative against one other creative, with a single difference between the two; in practice, though, tests are often structured to include a variety of creative iterations or concepts. The ideal creative test yields statistically significant results, but reaching significance requires more time, effort, and budget than the typical UA manager can spare. Even this brief, high-level look at the gap between the ‘testing’ built into advertising platforms and the more demanding reality of a true test suggests there’s more to testing than throwing new creatives into your main campaign all year long.
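To make “statistically significant results” concrete, the sketch below applies a standard two-proportion z-test to hypothetical install counts from two creatives. The numbers, the 0.05 threshold, and the function itself are assumptions for illustration — this is textbook statistics, not a description of any particular platform’s or vendor’s methodology.

```python
from statistics import NormalDist

def two_proportion_z_test(conv_a: int, n_a: int,
                          conv_b: int, n_b: int) -> tuple[float, float]:
    """Two-sided two-proportion z-test for conversion (install) rates.

    Returns (z, p_value). Illustrative only: real tests should also
    enforce minimum sample sizes before trusting the result.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)           # pooled rate under H0
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))       # two-sided p-value
    return z, p_value

# Hypothetical test: 10,000 impressions per creative,
# creative A drives 200 installs, creative B drives 260.
z, p = two_proportion_z_test(200, 10_000, 260, 10_000)
significant = p < 0.05   # True here: the gap is unlikely to be noise
```

Note what the example implies in practice: even a 30% relative lift in install rate needs thousands of impressions per creative before the difference is distinguishable from noise, which is exactly the budget and patience most “add it to the main campaign” testing never allocates.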
Managing high-performance UA campaigns across multiple networks, geos, and games is very time consuming, and it is commonplace for creative testing to take a backseat to other initiatives. The fastest and most typical way creative testing is done — adding new creatives to the primary UA campaign for a given game and geo — is also the least truthful, since the ad networks’ ML algorithms so heavily favor incumbent creatives. The same algorithms that run UA campaigns also make predictions about a new creative’s performance, often from an insufficient amount of data, simply for the sake of getting back to spending budget on creatives with a robust dataset. This typically produces underwhelming test results — many times with new creatives failing to receive a significant number of installs or any spend at all — and a feeling of pointlessness around maintaining a healthy creative concepting and iteration pipeline. A smaller subset of UA managers and teams do run structured creative tests, either because they’ve specifically hired that capacity into their teams or because their organization has seen valuable results from its testing methodology in the past, preventing it from being deprioritized. Many game studios that lack the bandwidth to handle testing internally, or that value the performance uplift creative testing can yield, leverage external partners to assist with creative testing.
Kaizen Ad can support your game’s growth and UA performance through our managed creative testing services. Kaizen creative testing operates independently of your primary UA account and ensures your new creative concepts and iterations get sufficient test data. By partnering with Kaizen to test and optimize your creatives before adding them to your primary UA account, you bolster the effectiveness of every creative you put media spend behind, adding a serious performance-optimization lever to your toolkit of UA tactics. Contact us today to see if our managed creative testing services are the right fit for your business!