Incipia blog

Comparing The Accuracy And Utility Of ASO Tools

Gabe Kwakyi | May 24, 2016


Given the ever-increasing costs of mobile advertising and the competition for PR and influencer attention, one of the top priorities for app marketers is to establish a winning app store optimization strategy and maximize their app’s share of high-quality, free downloads from keyword and category rankings. Capturing a top 10 rank for keyword searches, or better yet a top 3 position, where estimates indicate over 50% of downloads come from, is paramount to app store optimization efforts. While this may seem a daunting task, app marketers thankfully have a slew of tools at their disposal, built on years of data collected across millions of apps, to help manage app store optimization and inform keyword research. But amid this choice one question always arises: which ASO solution is the best?

To answer this question, we ran a lightweight test to figure out which app store optimization tool was the most accurate, and while we were at it we jotted down notes on the unique features each tool offers for managing an ASO strategy. Here is what we learned:

Test setup

  • Tools tested: Mobile Action, Sensor Tower, Searchman, Appcodes, MobiledevHQ (Tune).
  • Our test used one of our internal iOS apps, which was launched on February 3rd; the data was collected on 2/25/16.
  • 3 keywords were used to analyze the tools’ accuracy (goal tracker, personal goals and goal setting) for both our iPhone and iPad listings.
  • No external marketing was done during the study.
  • Each tool’s search volume estimates were compared to an auto-fill index score, based on the number of characters required to populate a keyword in the on-screen auto-fill results on an iOS device (we used an iPhone 6). We used this metric as our independent benchmark due to concerns over the accuracy of using Google Trends/Keyword Planner statistics to estimate Apple App Store search volume.
  • Mobile Action’s documentation suggests that keywords with a chance percentage over 40% and few competing apps have a good chance of hitting a top 10 rank; thus we used 40% as the prediction = yes threshold for Mobile Action, vs. 50% for the other tools.
  • 3 points were awarded for 1st place, 2 for 2nd place and 1 for third place for the overall accuracy rankings.
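The auto-fill index described above can be sketched in code. This is an illustrative reconstruction, not the article’s actual tooling: `autocomplete` is a hypothetical stand-in for the App Store’s on-screen suggestions, which in our test were read manually from the device.

```python
def autofill_index(keyword, autocomplete):
    """Return how many characters must be typed before `keyword` appears
    among the autocomplete suggestions (fewer characters = higher implied
    popularity). Returns len(keyword) + 1 if it never appears."""
    for n in range(1, len(keyword) + 1):
        if keyword in autocomplete(keyword[:n]):
            return n
    return len(keyword) + 1

# Canned suggestion source standing in for the store (hypothetical data,
# ordered by simulated popularity; only the top 3 suggestions are shown):
def fake_autocomplete(prefix):
    catalog = ["goal tracker", "goals app", "golf gps", "goal setting"]
    return [k for k in catalog if k.startswith(prefix)][:3]

autofill_index("goal tracker", fake_autocomplete)  # → 1 ("g" is enough)
autofill_index("goal setting", fake_autocomplete)  # → 3 (needs "goa")
```

As the caveat in Test 2 below notes, this index is only a rough proxy: prefixes beginning with common letters face more competition for suggestion slots than rare ones.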

ASO Tool Accuracy

Test 1: Predicting chances of attaining a top 10 rank (soon after the 7-day cliff occurred)

No tool received a perfect score for predicting whether our app could attain and sustain a top 10 rank for all 3 keywords across iPhone and iPad. The winner was a tie between Mobile Action and Appcodes. It’s worth noting that Appcodes indicated a 0% chance for 2 keywords, while none of the other tools indicated a 0% chance for any keyword.
  1. First place: Mobile Action & Appcodes (67% accurate at predicting a top 10 rank)
  2. Third place: Sensor Tower (33% accurate at predicting a top 10 rank)
  3. Fourth place: MobiledevHQ (17% accurate at predicting a top 10 rank)
Searchman was not scored because it does not have a chance/difficulty score; instead it provides a “keyword efficiency index” metric, which factors in the number of competing apps and search volume for a keyword.  

Test 2: Search volume

This phase of the contest proved problematic and yielded no clear winner, for several different reasons:
  • Determining search volume is a very difficult task still managed by guesswork, as Apple provides almost no help in determining search volume by keyword. Even Apple’s auto-fill is not a surefire, apples-to-apples way to determine search volume: certain letters may have far more volume in actuality than others (e.g. “s” vs. “x”), so a keyword that appears much sooner than another is not necessarily more popular.
  • While 3 ASO tools (Mobile Action, Sensor Tower and Searchman) each disagreed on the exact search volume score per keyword (Searchman consistently indicated single digit volume for keywords while estimates from Mobile Action and Sensor Tower ranged from 30% to 43%), these 3 tools did have consensus in A) how the keywords ranked in volume from first to third and B) that each keyword did in fact have volume.
  • MobiledevHQ and Appcodes each failed to list a volume score for 1 keyword, rendering their records incomplete; notably, each flagged a different keyword as low volume.
  • Google data was also contradictory: when it came to ranking the keywords by search volume, Google Trends and Google Keyword Planner (mobile) were out of sync both with one another and with the ASO tools.
Technically, Appcodes and MobiledevHQ both tied for first place in determining the correct order of 2 of 3 keywords, as indicated by the auto fill index score. Mobile Action, Sensor Tower and Searchman tied with 0% accuracy according to the auto fill index score. No points were awarded for this test.  
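The order-matching comparison used above can be sketched as follows. The logic and the numbers are illustrative assumptions, not the article’s actual data: a tool is scored on how many keywords it places in the same ordinal position as the auto-fill benchmark (lower auto-fill index = higher implied volume).

```python
def order_accuracy(tool_volumes, autofill_indices):
    """Fraction of keywords placed in the same ordinal position by the
    tool (sorted by descending volume score) and by the auto-fill
    benchmark (sorted by ascending index)."""
    tool_order = sorted(tool_volumes, key=tool_volumes.get, reverse=True)
    bench_order = sorted(autofill_indices, key=autofill_indices.get)
    matches = sum(a == b for a, b in zip(tool_order, bench_order))
    return matches / len(tool_order)

# Hypothetical scores for the three test keywords:
tool = {"goal tracker": 43, "goal setting": 35, "personal goals": 30}
bench = {"goal tracker": 2, "personal goals": 3, "goal setting": 5}
order_accuracy(tool, bench)  # → 1/3: only "goal tracker" agrees
```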

Test 3: Number of apps returned for a keyword search query

This was a straightforward and accurate metric to measure, as each keyword search returned fewer than 100 apps total.
  1. First place by a hair was Mobile Action (100% accuracy)
  2. Second place was a tie between Sensor Tower and Appcodes (83% accuracy)
Searchman and MobiledevHQ received an incomplete score as they each failed to provide app counts for 2 keywords.

Test 4: Actual keyword rank

This metric was measured by comparing the actual rank of our app for a keyword, to the rank currently indicated in the ASO tool.
  1. First place: Appcodes (average deviation of 7% vs actual rank)
  2. Second place: Mobile Action (average deviation of 11% vs actual rank)
  3. Third place: Searchman (average deviation of 13% vs actual rank)
  4. Fourth Place: Sensor Tower (average deviation of 14% vs actual rank)
MobiledevHQ received an incomplete score for not indicating a rank for 3 of the 6 keyword searches.  
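One plausible way to compute the deviation figures above is average percent deviation relative to the observed rank. This is a sketch of the assumed formula, with hypothetical numbers, not the article’s actual data or method:

```python
def avg_rank_deviation(pairs):
    """pairs: list of (tool_rank, actual_rank) tuples, one per keyword/device
    combination. Deviation is measured relative to the actual rank we
    observed in the store, then averaged across all pairs."""
    devs = [abs(tool - actual) / actual for tool, actual in pairs]
    return sum(devs) / len(devs)

# Hypothetical ranks for six keyword/device combinations:
avg_rank_deviation([(9, 10), (21, 20), (5, 5),
                    (11, 10), (30, 28), (7, 8)])  # ≈ 0.074, i.e. ~7%
```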

Overall Accuracy results…

  1. First place (tied with 8 points each) Mobile Action & Appcodes
  2. Third place Sensor Tower (4 points)
  3. Fourth place (tied with 1 point each) Searchman & MobiledevHQ

Unique features breakdown by ASO tool

While each tool does have its own set of unique features and functionality, the tools did share some similar, core functionality. Here are the main features common to all tools:
  • Total number of apps returned per keyword query
  • Keyword search volume
  • Keyword difficulty/chance %/efficiency index
  • Reviews/ratings
  • Keyword/category rank history
  • Competitive app tracking
  • App download/revenue
  • Visibility score
  • Estimated number of downloads for a given app in the store

Mobile Action:

Mobile Action’s core utility is as an ASO optimization suggestion and competitive/keyword research tool. Unique features include:
  • Actionable suggestions based on app listing
  • 90 day forecasting of downloads/revenue
  • Review analysis to identify common keywords

Appcodes:

Appcodes’ core utility is as a barebones keyword analysis tool. Unique features include:
  • Scaled promo code distribution
  • Estimate of the keywords in each competitor’s keywords space
  • Competitor press mentions (these seem to date only from 2013)

Sensor Tower:

Sensor Tower’s core utility is as a rankings and keyword/competitive research tool. Unique features include:
  • Keywords space optimization to maximize return within character limits (local and international)
  • Returning both keyword statistics and app results for a keyword search query

Searchman:

Searchman’s core utility is in its diverse and unique set of features for keyword and competitive analysis. Unique features include:
  • Facebook ads targeting & CPI suggestions based on keywords/competitors
  • Keyword efficiency score (a combination of search volume to competition)
  • Competitive app description comparison
  • Keyword optimization workflow (pulling in keyword ideas from several sources and assisting in optimizing title, keywords space and description)
  • Scaled data requests for keywords, ads and market data via API

MobiledevHQ:

MobiledevHQ’s core utility is as a download attribution and rankings insights tool. Unique features include:
  • Most popular category of apps returned for a keyword
  • Organic to paid downloads breakdown
  • Estimation of installs driven by keywords
  • App store macro ranking volatility tracking system (sonar)
  • Indicator showing when a competitor is outranking you for a keyword

App Annie:

While App Annie doesn’t provide search volume or a difficulty score and as such wasn’t compared above, it is a common and useful ASO tool. App Annie’s utility is for tracking rank and store stats over time. Some unique features include:
  • Rank change for all returned apps for any keyword search across iPhone, iPad and Android. Note, though, that App Annie does not seem to show apps when the number of returned apps is above ~2,000, and it sometimes shows no results even when results do exist.
  • Historic ranking for keywords
  • Keywords tracking for any app
That’s all for today, folks. Thank you for reading along. Please feel free to reach out with questions, comments, inaccuracies or requests for future topics of interest: hello@incipia. Incipia is a full-stack mobile agency – we design, develop and market mobile apps for startups and corporations. Visit Incipia for more information.