AlexaDev Tuesday – Check Those Alexa Skill Metrics: It’s Easy!

Publication Date: 6/12/17 – In the beginning it was a pain in the neck to keep on top of your Alexa skill metrics. Now that Amazon provides easy tools right in the developer portal, there’s no reason not to track them, and let them inform your skill improvement and development decisions.


Be Data Driven


There’s a direct link to view metrics for each of your live skills right there on the dashboard:


Alexa Skill Metrics


You can view metrics on unique customers, sessions and utterances, both in summary and detail form.


What Metrics Can You See?
The Overview tab gives a great at-a-glance view of how your skill is performing in general. I shared an Overview screenshot from my Bingo skill a few posts back, but it’s worth bringing it back here for illustrative purposes:



How Does Any Of This Help Developers?
Mainly, it has to do with finding out how your skill is performing and how successfully people are interacting with it. You may think you covered all the bases during your test phase, but your skill metrics may tell a different tale. If they do, it’s important for you to know that so you can release an update if necessary, and so you can avoid repeating the same mistakes in future skill releases. It’s not shown in the screenshot above, but there are drill-down tabs for Customers, Sessions, Utterances and Intents.

The Customers drill-down tab is basically a headcount of unique customers using your skill.

In the Sessions drill-down tab, you can view total sessions vs. successfully ended vs. ‘user did not respond’ ended, as well as a failed session count. Failed is obviously the most important metric here. Remember that anytime the session ends due to the default 8-second timeout, that counts as a “failed” session. If you’re seeing a lot of failed sessions, check your reprompt logic to ensure the user isn’t left hanging, and is given another chance to respond before that timeout hits.
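In practice, that means any response that expects an answer should include a reprompt and leave the session open. Here’s a rough sketch of what that looks like in the raw Alexa response JSON, written in plain Python with no SDK; the helper name is my own, not part of Amazon’s API:

```python
def build_response(speech, reprompt=None):
    """Build a minimal Alexa response body. If a reprompt is supplied,
    the session stays open and Alexa re-prompts a silent user instead
    of letting the timeout mark the session as failed."""
    response = {
        "outputSpeech": {"type": "PlainText", "text": speech},
        # When we expect an answer, shouldEndSession must be False
        # so Alexa keeps listening for the user's reply.
        "shouldEndSession": reprompt is None,
    }
    if reprompt is not None:
        # Spoken if the user stays silent past the timeout window,
        # giving them a second chance to respond.
        response["reprompt"] = {
            "outputSpeech": {"type": "PlainText", "text": reprompt}
        }
    return {"version": "1.0", "response": response}

# A question should always carry a reprompt:
body = build_response(
    "Ready for your next number?",
    reprompt="You can say yes for another number, or say stop to quit.",
)

# A final goodbye can end the session cleanly:
closing = build_response("Goodbye!")
```

The key detail is the pairing: a reprompt only helps if `shouldEndSession` is also false, so the session is still open when the reprompt fires.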

In the Utterances drill-down tab, you can view successful utterances vs. failed utterances, and ‘no user response’ count. This will tell you how much ease or difficulty users are having interacting with your skill. ‘No user response’ may just mean the user purposely chose to stop interacting with the skill, but if you’re seeing a high count there you’ll want to go back over your interaction model to make sure there’s a logical path for the user to follow at each stage of the interaction.

In the Intents drill-down tab, you can view successful vs. failed intents, which also speaks to the user interaction experience. Failed intents are generally things the user said that didn’t match anything in your Intents and Utterances model. Given that users can of course say literally anything to Alexa, and as of yet there’s no catch-all “no matching intent” response for developers to implement, this metric isn’t as easy to map to possible improvements. If you’re seeing lots of failed intents, about the best you can do is go back over your Intents and Utterances and fill them out with more variations on the words and phrases you’ve already got there.
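For example, in the developer portal’s Sample Utterances file, each line maps an intent name to one phrasing, so padding an intent with the synonyms users actually say is just a matter of adding lines. A hypothetical sketch for a Bingo-style skill (the intent name and phrases here are my own, for illustration only):

```
NextNumberIntent next number
NextNumberIntent call the next number
NextNumberIntent give me another number
NextNumberIntent what is the next number
```

Each variation you add is one less way for a reasonable request to land in the failed intents column.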


Alexa skill metrics are free and easy to use, and they can improve the quality of your skills. If you’ve got published skills, start checking those metrics!