
AWS Lex


13 Jun 2017

Chatbots and AWS Lex

If there’s one thing I love, it’s automation. And if there’s one thing I hate, it’s dealing with humans.

We are currently in the process of migrating over 300 stores to a new software/hardware solution. Unlike myself, the owners of these stores despise technology and refuse to read their email. Instead, the owners opt to call the helpdesk to ask questions which have already been answered.

I recently found myself in the position of creating a chatbot to automate our helpdesk and assist with this task.

As we are already locked into the AWS platform, our architect decreed that AWS Lex was to be the chosen solution, without consideration of other chatbot systems. As such, I have not investigated other systems and will not be providing comparisons.

Here’s my overall experience and a couple of the gotchas I encountered along the way.

Overview

A bot consists of a couple of components which can be tricky to understand in isolation.

Bot

A Bot is a collection of Intents with some configuration. More on Intents below.

Each bot has a set of default responses for errors and hang-up phrases, as well as configuration such as how many times to retry a prompt.

Gotcha!!

A bot is configured with a particular voice, so note that it is not simple to change the voice for each user or each request. This seems to be a commonly requested feature, so I’ve noted a couple of workaround options in this StackOverflow answer about changing the Lex voice.

Intent

An Intent is the magic. Each intent can be thought of as a conversation with a single goal in mind.

Sample utterances are the conversation openers. Something like: “I would like to book a flight”. These can contain Slots which are described below.

Slot

Slots are parameters you need the user to provide, such as “I would like to book a flight to Fiji”, where Fiji fills your destination slot. If a slot is not provided in the sample utterance, the user will be prompted to provide it. A slot has a type, which the machine learning algorithm generates from a set of examples you provide. For example, the destination slot will have a type resembling countries and cities, generated from a sample list you supply.
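For illustration, the configuration for a hypothetical flight-booking intent might look something like this (the slot name and values are made up for this example; in the console a slot appears in an utterance as a placeholder in curly braces):

```
Sample utterances:
  I would like to book a flight
  I would like to book a flight to {Destination}
  Book me a flight to {Destination}

Destination slot type values:
  Fiji, Sydney, Auckland, Wellington
```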

Gotcha!!

Slots cannot use the built-in AMAZON.YesIntent and AMAZON.NoIntent that Alexa uses. This caused me a great deal of pain, but I found a hacky solution which I’ve put in this StackOverflow answer about Yes/No Intents.

If anyone has a better solution to this, please add an answer to that question.

Lambda Hooks

The Intent can hook into AWS Lambda functions for validation and fulfillment.

The validation hook is an effective way to do mid-conversation processing, ensuring you can successfully fulfill the user’s request. For example, you might like to check that the destination the user provided is a real location.

Once you’ve collected all this information from the user you need to do something with it. This is the job of the fulfillment hook.

The details of what this should do will entirely depend on your application. For my FAQ bot, it was essentially just providing a response to the user’s query.

There are plenty of example blueprints in AWS Lambda to give you an idea of how to implement these hooks.

Gotcha!!

But there’s no blueprint for Java.
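To give a rough idea of what a Java hook looks like, here’s a minimal fulfillment handler sketch (the class name and response text are made up, and it builds the Lex event/response structures with plain maps rather than following any official blueprint):

```java
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;

import java.util.HashMap;
import java.util.Map;

// Minimal Lex fulfillment hook: read the intent name from the event and
// return a "Close" dialog action with a plain-text answer.
public class FaqFulfillmentHandler implements RequestHandler<Map<String, Object>, Map<String, Object>> {

    @Override
    @SuppressWarnings("unchecked")
    public Map<String, Object> handleRequest(Map<String, Object> event, Context context) {
        Map<String, Object> currentIntent = (Map<String, Object>) event.get("currentIntent");
        String intentName = (String) currentIntent.get("name");

        Map<String, Object> message = new HashMap<>();
        message.put("contentType", "PlainText");
        message.put("content", "Here is the answer for " + intentName);

        // "Close" tells Lex this intent has been fulfilled and ends the conversation turn.
        Map<String, Object> dialogAction = new HashMap<>();
        dialogAction.put("type", "Close");
        dialogAction.put("fulfillmentState", "Fulfilled");
        dialogAction.put("message", message);

        Map<String, Object> response = new HashMap<>();
        response.put("dialogAction", dialogAction);
        return response;
    }
}
```

The validation hook receives the same event shape; there you would typically return an ElicitSlot or Delegate dialog action instead of Close.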

AWS Console UI

Is absolutely awful.

Gotcha!!

I spent an unfathomable amount of time trying to figure out why I couldn’t make any changes to my bot. It turns out that when your bot is built, the state is retained in a version, and you must change the version to Latest in order to make edits.

AWS CLI API

The CLI on the other hand, is brilliant.

The CLI documentation is easy to understand. The commands are conveniently separated for build (Build Documentation) and use (Usage Documentation). And the output from one command flows into the next.

It was trivial to construct a build and deploy script calling this API. Once you have a framework in place, this is far better than using the AWS Console UI.
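For example, the core of my build and deploy script boiled down to a handful of calls like these (the bot, intent and file names are placeholders for this sketch, and when updating existing resources you also need to pass back the checksums returned by the get-* commands):

```bash
#!/usr/bin/env bash
set -euo pipefail

# Build the pieces bottom-up: slot types, then intents, then the bot itself.
aws lex-models put-slot-type --cli-input-json file://slot-types/destination.json
aws lex-models put-intent    --cli-input-json file://intents/book-flight.json
aws lex-models put-bot       --cli-input-json file://bots/faq-bot.json

# Check the build status of the $LATEST version.
aws lex-models get-bot --name FaqBot --version-or-alias '$LATEST'

# Point an alias at the bot for clients to use.
aws lex-models put-bot-alias --name demo --bot-name FaqBot --bot-version '$LATEST'

# Smoke test the deployed bot from the command line.
aws lex-runtime post-text \
  --bot-name FaqBot \
  --bot-alias demo \
  --user-id build-script-test \
  --input-text "I would like to book a flight to Fiji"
```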

Android App

Ok, this was pretty cool.

Nothing is more of a wow factor than chatting in real time to your virtual assistant. The best part of this was how simple it was to create.

AWS Mobile Hub is amazing for creating a mock app with your services integrated.

  1. Create a project
  2. Add a Conversation Bot and select your Lex bot
  3. Download the source code for your mobile platform (Android in my case)
  4. Install the app on your device
  5. Showcase your magic!

For an extra bit of wow factor I replaced the app icon, splash icon, style colours and some of the sample text. As I’m familiar with Android, this only took a couple of minutes, but it really helped push the appeal.

It was incredibly well received. I highly recommend this for any demo of Lex.

Closing

Was I able to create an FAQ bot to replace the helpdesk?

Yes

Would I use this service in future for another bot?

Yes, now that I know how it works.

Would I recommend this service to a friend?

It depends

All of this stuff was great, but you have to be very careful what you are looking to turn into a chatbot. For my particular use case (basically an FAQ), I hardly leveraged machine learning at all. Lex is built to be a conversation bot, so you need to make sure you are programming a conversation.



Thanks for reading

If you enjoyed the content please consider leaving a comment, sharing or hiring me.

Cheers,
Michael

