AlexaDev Tuesday – Guest Post From Dustin Coates: All About Alexa Skill Localization

Today we take a break from the deep dive series about my Visual Tarot skill to share a guest post from voice developer Dustin Coates. If you’ve got questions or confusion about Alexa Skill Localization, this post should help clarify things for you.

 


 

India, Australia, New Zealand, Ireland… who knows where next? Alexa is reaching more locations, giving Alexa skill developers more opportunity to extend their reach. Developers who move quickly can gain an advantage by being first to publish their skills in a new locale. For locales like India, you could simply take your existing English-language skill and open it up to a new location; this, after all, is what Amazon did to launch in India with 11,000 skills. But in countries where Alexa speaks another language, that isn’t enough. Even in English, tailoring your skill to include local words or ways of speaking will provide a more engaging experience for your users.

The Alexa Skills Kit SDK for Node.js has just the tooling you need to localize your skill. It comes down to a three-step process: create your localized strings in an object with a special format, register the strings, and call the translation method. We’ll look at them one by one.

The strings object is a collection of strings grouped by locale. In this object you’ll place the responses you want Alexa to say. Grouping all of them together provides localization as well as good code organization, because you won’t need to hunt all over your code when you want to change a response. The object takes the form of:

const languageStrings = {
  "en-US": {
    translation: {
      Weight: "An average adult male elephant weighs 9,000 pounds."
    }
  },
  "en-GB": {
    translation: {
      Weight: "An average adult male elephant weighs over 640 stone."
    }
  }
};

The top-level keys are the locale codes you want to support, and the innermost values are key-value pairs of response identifiers and their strings. (Why that in-between “translation” key? The ASK SDK uses i18next, which has its own syntax. In this situation, we’re just along for the ride.) This strings object must be registered before being used, by setting it as “resources” on the alexa object.

const Alexa = require("alexa-sdk");

exports.handler = function(event, context) {
  const alexa = Alexa.handler(event, context);
  alexa.resources = languageStrings;
  alexa.registerHandlers(handlers);
  alexa.execute();
};

Now, your strings are ready to be used in your handlers. There’s a little bit of “magic” here, and it all comes about through the method t. Provide it the key of the localized string you want.

const handlers = {
  GetWeightIntent() {
    this.response.speak(this.t("Weight"));
    this.emit(":responseReady");
  }
};

Now, whenever the GetWeightIntent is triggered, a different response will be provided depending on the user’s locale, which the SDK reads from the incoming request and which reflects the user’s language settings.
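If you’re curious where that locale comes from, you can peek at it yourself. Here’s a minimal sketch, assuming the v1 ASK SDK, where the raw request is exposed on this.event and its locale field is what i18next is initialized with:

const handlers = {
  GetWeightIntent() {
    // e.g. "en-US" or "en-GB", taken straight from the incoming request
    console.log(this.event.request.locale);
    this.response.speak(this.t("Weight"));
    this.emit(":responseReady");
  }
};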

You might be wondering, then: what if you don’t have a string for a certain locale? i18next supports fallback strings. Provide an array, with the first item being the key to find in the strings object and the second item being the fallback value. If the locale-key pair isn’t found, i18next will return the fallback instead.

this.response.speak(this.t(["Weight", "An average adult male elephant weighs 4,082 kilograms."]));

You’re well on your way to localized Alexa skills, but there’s one more, undocumented trick. The strings we’ve used so far are static, which wouldn’t work very well if you were grabbing information from the user request or from an API and returning it as part of a localized string. For this, turn to interpolation.

const languageStrings = {
  "en-US": {
    translation: {
      HumanWeight: "You weigh {{weight}} pounds."
    }
  }
};

const handlers = {
  GetWeightIntent() {
    this.response.speak(this.t("HumanWeight", {weight: 165}));
    this.emit(":responseReady");
  }
};

In this setup, a double-braced reference is placed in the pre-defined string. It’s given a value by a second argument to t: an object whose keys match the references and whose values are what you ultimately want in the string. If you prefer sprintf-style placeholders (e.g. “You weigh %d pounds.”), the SDK also includes the i18next sprintf post-processor.
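As a quick illustration of the sprintf style, here’s a minimal sketch modeled on Amazon’s sample skills, assuming the SDK has wired up the post-processor so that extra arguments to t fill the placeholders in order:

const languageStrings = {
  "en-US": {
    translation: {
      HumanWeight: "You weigh %s pounds."
    }
  }
};

const handlers = {
  GetWeightIntent() {
    // Each additional argument to t fills the next sprintf placeholder
    this.response.speak(this.t("HumanWeight", "165"));
    this.emit(":responseReady");
  }
};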

 

With these localization tools, you can make sure Alexa is speaking the language of your users no matter where they are, whether they say “Howdy,” “G’day,” or “Guten Tag.”

 

* * *

Dustin Coates’s Voice Applications for Alexa and Google Assistant teaches you how to design, build, and share voice apps. Michael Jensen says, “This book is a great introduction to Alexa development with step by step examples for Skill development. If you are looking to get started with creating skills for Alexa, this is the book for you.”



* * *

 

