Get the 411 on how to troubleshoot your app store optimization strategy. Image credit: App Annie, an ASO tool.
Let's set the scene for a moment: say you've spent weeks or even months poring over the internet, consuming posts/videos/Tweets/GIFs to learn about app store optimization. You've made several passes at optimizing your iOS keywords or running Google Play A/B tests. You've had some successes and some failures, but you're feeling the learning burn as your familiarity with the iTunes Connect dashboard, the Google Play Console and terms like product page views, download velocity and app units grows. Things are going well and you begin to feel you've got a grip on how to move the needle via ASO. Yet all of a sudden (or, more likely, following a change), one of the statistics you've become attuned to alerts you to an issue that heralds a disruption in your blossoming ASO strategy. Your impression growth turns to contraction, one or more important keywords loses rank unexpectedly and your conversion rate dips dangerously low.
If this has happened to you, don't worry: we've all been there.
When it comes down to it, learning to manage a *consistent* app store optimization strategy can be a frustrating process of fitful positive and negative performance swings and opaque data. Even if you have implemented all the best practices, done your homework and run a series of A/B tests, your performance can still end up going south.
Why does this happen, and how can you prevent it from happening?
Before diving in, let's review a handful of key points regarding ASO that are useful to keep in mind:
- Country/category rankings are mostly straightforward: the faster your app receives downloads or sales, the better your ranking will be.
- Keyword rankings are far more nuanced, which means that how keyword rankings operate can be difficult to pin down:
- Download velocity is still the most important ranking factor, yet downloads must be sourced from searches for a particular keyword in order to boost ranking (though downloads can produce a halo effect for keywords an app ranks for).
- Other factors influencing your app's keyword rank include your app's rating and reviews, retention (and, more broadly, active user base), keyword relevance, competitive click-through rate and more. That said, the keyword algorithms are Apple's and Google's secret sauce, and it is highly difficult to prove A) which factors assuredly do influence rank and B) to what degree each factor influences rank.
- There are four main factors that cause your rank to change:
- Your app – metadata, performance, marketing efforts, the app itself, etc.
- The competition – i.e. more total apps, the keywords they rank for, their marketing efforts/product, etc.
- App users – the people on the other end of the statistics you monitor. People sometimes react predictably, and sometimes not so much. Over time, people may also shift their preferences.
- Apple and Google – as the owners of the App/Play stores, these two tech companies are the fourth factor that can influence your app's rank, often independently of the three factors above.
- Key ASO metrics that cannot be ascertained with certainty (as of yet):
- Which keywords drive organic download volume.
- While we can now differentiate app referrers, web referrers, app store browse and app store search, we still cannot differentiate search ads from organic searches, or country vs. category top chart vs. feature vs. specific-feature traffic sources.
- All the keywords your app ranks for (unless you pay for a powerful ASO tool).
- Monthly search volume of any given keyword.
- Make sure to keep a record of all of your app's elements over time (e.g. title, keywords, description, etc.), so that you can A) revert changes if necessary and B) identify which changes correlate with changes in performance. While it's very easy to make a lot of changes, it takes much more time to properly track, correlate and draw conclusions from them; ASO experts know the value of exercising diligence in their methods.
- Keyword optimization is not the only component of ASO that you must focus on. Conversion rate optimization (i.e. turning visitors into downloaders) is also a key piece of your strategy that cannot be ignored.
- After making a change, wait for at least one week before making a new change if possible. You will often find that the initial performance changes over time, due to reasons such as the App/Play store's reliance on trending data, keyword ranks landing initially at one level and rising or falling over time, and because App/Play store data is often delayed by a few days, making the majority of all changes appear negative at first.
- If you are running Google Play experiments (which push live to your Play Store traffic), be aware that your rankings can be negatively affected by your test variants, especially if you're running tests at 50% of your total traffic and your test variants receive a significantly lower conversion rate than your control.
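To make the traffic-split caution above concrete, here is a small sketch (with invented numbers) of how a weaker variant at a 50/50 split drags down the store-wide conversion rate while the experiment runs:

```python
# Hypothetical numbers illustrating how a weaker Play experiment variant
# lowers your live, store-wide conversion rate at a 50/50 traffic split.

def blended_cvr(control_cvr: float, variant_cvr: float, variant_share: float) -> float:
    """Store-wide conversion rate when `variant_share` of live traffic
    sees the test variant and the remainder sees the control."""
    return (1 - variant_share) * control_cvr + variant_share * variant_cvr

# Control converts at 30%, variant at 20%, test running on 50% of traffic:
overall = blended_cvr(0.30, 0.20, 0.50)
print(f"{overall:.2%}")  # 25.00% – a five-point drop for the test's duration
```

Fewer conversions per impression while the test runs can, in turn, weigh on the rankings discussed earlier.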
- If you are curious about impression spikes, see our post for 9 possible reasons.
- Growing your overall visibility is important, yet always keep your KPI in mind. If you are growing visibility without growing downloads, active users or sales, then you may need to adjust your strategy.
Why Your ASO Strategy May Become Disrupted
Moving on to the question of why performance may move against your expectations, it's important to acknowledge that there are many, many possible answers to this question beyond what is listed here; but you can use these ideas as a starting place:
- Factors within your control:
- Changes to your app itself or your app listing/metadata – are your changes leading to bugs or better satisfying users? Are you swapping good phrases or high combinatory-value keywords for under-performers? Are your screenshots or description causing more people to become disinterested than interested?
- Your marketing strategy – are you driving fewer downloads, retaining users for a shorter time or confusing users with your messaging?
- The frequency with which you make changes – making changes is good if supported by data, but too many changes can disrupt progress, skew analysis and prevent effects from fully taking hold; conversely, too few changes will lead to stagnation.
- Factors outside of your control:
- Algorithm changes – they do happen from time-to-time and can have a big impact on all or a subset of apps.
- Other apps' marketing strategies – are your competitors outpacing you in the algorithm-critical factor of download velocity, are they earning better reviews or have they designed a more appealing app listing?
- Keywords other apps rank for – are more apps starting to rank on your keywords, are they able to obtain higher ranks or are they more relevant for your keywords?
Troubleshoot via Data
While probably not quite like Keanu Reeves in the Matrix, you should become comfortable using data to feel your way through troubleshooting.
Here are a few good statistics to keep in mind, and how to use them:
Impressions/store views – impression volume is the most direct indicator of ASO performance. While country/category/keyword rankings are indicators as well, they are more directly influenced by downloads (which are one step after impressions in terms of ASO). Impressions come before a product page view or download, and as such they touch every user, which means they can tell you the most about the general progress of your ASO strategy. Some users will see your app but not click or download, meaning that your rank would not immediately change, even though your app came into contact with another user.
That said, while impressions precede downloads and are not directly affected by them, do keep in mind that impressions are indirectly affected by downloads: your rankings change with downloads, producing better or worse visibility for your app, and in turn more or fewer impressions. Also, for search impressions, your keyword rank can shift by some minuscule degree when a search impression fails to produce a click or download, as a reflection of your app's inability to perform at that keyword rank.
If you make a change and see that impressions have risen, it indicates that your app's visibility is on the rise – good job! If they have declined, then you know your change reduced your app's visibility. At that point, if your conversion rate is improving, it may be worth waiting to see whether you can increase downloads even with less overall visibility (e.g. some impressions come from less relevant keywords, and losing them isn't much of an issue). But if both your conversion rate (by your install or post-install metric) and your impressions drop and you haven't gained rank for your target keywords, then your change was probably not a good one.
Additionally, be aware that impressions can change often and sporadically (given the four factors affecting rank named above), and oftentimes your change may not necessarily be correlated with changes in impressions. Gather more data, run a formal A/B test or consider only larger shifts in performance to increase your confidence in your findings.
Country/category/keyword rankings – while you can see your app's own installs (and spy on others using ASO tool estimates), country/category ranks are a good indicator as to whether your changes have moved the needle in relative terms. That is, how well your downloads stack up to the competition. By tracking ranks for a few competitors, you can get a sense of whether you are beating the index, and thus whether your results are anything to write home about, or could be chalked up to "market movement."
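The "beating the index" idea above can be sketched numerically: compare your rank movement against the average movement of a few tracked competitors. All names and numbers here are illustrative, not real store data:

```python
# Rough "did we beat the market?" check: compare our rank improvement
# against the average rank movement of a handful of tracked competitors.

def rank_delta(before: int, after: int) -> int:
    # Positive = improved (moved toward #1); negative = lost ground.
    return before - after

my_delta = rank_delta(before=42, after=35)    # +7 positions
competitor_deltas = [
    rank_delta(50, 48),                       # +2
    rank_delta(12, 15),                       # -3
    rank_delta(30, 29),                       # +1
]

market_avg = sum(competitor_deltas) / len(competitor_deltas)
beat_index = my_delta > market_avg
print(my_delta, market_avg, beat_index)  # 7 0.0 True
```

If competitors moved up just as much as you did, your gain may be market movement rather than the result of your change.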
Be aware, though, that your app may compete with apps on keywords that exist in other categories, meaning that capturing a top rank is good, but may not capture the extent of your competition. This is especially important to consider when running Apple search ads, where apps can more easily cross categories to capture downloads by paying for ads on keywords they are "relevant enough" for, as determined by Apple's search ads algorithm.
Tracking your rank for keywords is one way to get around the opacity of which keywords are driving downloads for your app. If you make a change and see that your target keyword rank has improved, pat yourself on the back; if the opposite occurs, or you lose rank on other target keywords, consider reverting your metadata or (if you earned better rank on your most recently targeted keyword) working forward by re-targeting the other keywords that lost rank.
Be vigilant of the fact that changing your app's keyword mix can alter the relevance and thus your app's rank for keywords. Also, attempting to stuff keywords in order to raise your app's relevance and thus rank can backfire (this can happen often for Android apps in the Google Play Store).
Impression-to-product page or impression-to-install conversion rate – figuring out whether your app is turning more visitors into deeper visitors (product page views) or even installs (users can bypass your product page and install straight from the impression) is a very important part of your CRO (conversion rate optimization) strategy, spanning both keyword optimization (making sure the keywords your app ranks for are relevant and producing downloads) and creative optimization (ensuring that your app's listing is appealing enough to convince users to download it).
Product page-to-install conversion rate – the difference between an impression conversion and a product page conversion is that conversions from your product page bring additional elements of your app's listing into play, including your user reviews, overall rating, third and later screenshots, description and general app information (additionally, for Google, screenshots, the feature graphic and the short description do not show in a search impression). This means that analyzing changes made to any of these product page-level elements should be done from a PPV-to-install conversion rate perspective.
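The funnel math behind these two conversion rates can be sketched as follows; the counts are invented for the example:

```python
# Illustrative funnel: the same store data expressed as the conversion
# rates discussed above. All counts are made up for the example.

impressions = 10_000
product_page_views = 1_200
installs = 300  # includes installs made directly from the impression

impression_to_ppv = product_page_views / impressions    # 12.0%
ppv_to_install = installs / product_page_views          # 25.0%
impression_to_install = installs / impressions          # 3.0%

# Note: the two stage rates multiply back to the end-to-end rate only
# for installs that actually passed through the product page; users who
# install straight from a search impression skip the middle step.
print(f"{impression_to_ppv:.1%} {ppv_to_install:.1%} {impression_to_install:.1%}")
```

Watching each stage separately tells you whether to work on keyword relevance (top of funnel) or page creative (bottom of funnel).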
Read more about App Store statistics.
A/B testing is crucial for conversion rate optimization, and one of the most important aspects of A/B testing is to be diligent and patient. Pitfalls can abound when you:
- Attempt to speed through testing – make sure to let your A/B tests run for at least a week, especially if you're running more than one variant. In the case of Google Play experiments for example, performance can often start off one way and flip over time, as uninstall/active user data collects. Also, if you're running more variants, the traffic that the variant receives comes from different sources of your total traffic (e.g. keywords), which change over time, meaning that performance can swing day over day by virtue of this fact, rather than the variant's true impact on performance.
- Test too many changes at once – this can lead to the conundrum of not understanding what change caused what outcome in performance (e.g. was it the color, visual, text, symbols, etc. that produced the outcome?).
- Test without a hypothesis in mind – while you can still test for the sake of testing, doing so will leave you without a clear path to proceed and is a bad habit to get into. For example, testing a user review in your description is good, but a hypothesis would be that social proof is important, and the next test should further explore that hypothesis with other forms of social proof, to confirm whether social proof is indeed important to your app's success. Run a test with some idea of what you think will perform best and a reason as to why, which will help you build a deeper understanding of what works or doesn't, and what to try next.
Lastly – consider the fact that negative test results are actually a good sign. When a variant's performance is significantly negative, you've found an element that has a significant impact on performance overall, and by process of elimination you can work your way to a change that produces a positive impact!
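One generic way to judge whether a variant's result (positive or negative) is significant is a two-proportion z-test. This is a standard statistical check, sketched below with made-up numbers; it is not how Google Play computes its own experiment confidence intervals:

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """z-statistic for H0: the two conversion rates are equal."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical data – control: 300/10,000 installs; variant: 220/10,000.
z = two_proportion_z(300, 10_000, 220, 10_000)
print(round(z, 2))  # |z| > 1.96 roughly corresponds to significance at 5%
```

A clearly significant negative result here is still informative: it isolates an element that moves the needle.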
If you can't do A/B testing, you can also run a pre-post analysis. While pre-post analyses will produce less confident results and are more prone to problems with noisy data, they are easy to do, require no special set-up and can be useful for directionally identifying whether a change produced a positive result or not, wherein you can determine whether you want to keep the change, revert or run an A/B test to become more confident.
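A pre-post analysis can be as simple as comparing average daily installs before and after a change. The daily counts below are invented; real data would come from your iTunes Connect or Play Console exports:

```python
# Bare-bones pre/post comparison, as an alternative when A/B testing
# isn't available. Daily install counts are invented for illustration.
from statistics import mean

pre = [210, 195, 230, 205, 220, 198, 215]    # 7 days before the change
post = [240, 225, 250, 238, 231, 244, 236]   # 7 days after the change

lift = (mean(post) - mean(pre)) / mean(pre)
print(f"{lift:+.1%}")  # treat as a directional signal only: noise,
                       # seasonality and delayed store data can easily
                       # produce a shift of this size on their own
```

If the directional result looks promising, you can then run a formal A/B test to gain confidence before committing to the change.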
Incipia is a mobile app development and marketing agency that builds and markets apps for companies, with a specialty in high-quality, stable app development and keyword-based marketing strategy, such as App Store Optimization and Apple Search Ads. For post topics, feedback or business inquiries please contact us, or send an inquiry to email@example.com.