Talking Covid-19 with Amazon Lex

A brief introduction to Amazon Lex

Over the last few days I have had a chance to play with Amazon Lex, and used the time to make a simple bot for the current hot topic – Covid-19 statistics. It was an opportunity to play with Lex, and to get reacquainted with the AWS Serverless Application Model (SAM). With a dash of CloudFormation for additional wiring, and by spinning up a web stack from Amazon's example Web UI, I was able to have a bot running within a day.

Anatomy of a ChatBot

We want the process for using a ChatBot to be simple for the end user; it should be like a conversation. I want to know something, so I ask in natural language and hopefully get a sensible reply. 

This is where we rely on Lex. Lex allows us to configure Intents, which are actions we would like our bot to take. These can be a simple static reply, or a more complex function which needs to call out for fulfillment, e.g., to a Lambda. These intents can have Slots: pieces of information from the conversation that are necessary (or optional) to fulfill the request. In this example, we need to know what country the user is asking about.

Architecture of a ChatBot


I started playing around with Lex in the console. At present there is no CloudFormation option for it, but you can use custom resources and a Lex schema to create one; an example of this is present in the aws-lex-web-ui project. You can also export the schema from the console, which could be a way to migrate from a spike to a more robust CI/CD workflow for the product.

Intent Detection

So how does Lex understand what the user is asking about? We need to give it some examples. For example: 

– How are things going in Australia?

– What’s the state of the Virus in Singapore?

– Tell me about Japan?

One interesting note: Lex doesn't like sentences ending in question marks, so we need to remove those for the bot to build:

– How are things going in Australia

– What’s the state of the Virus in Singapore

– Tell me about Japan

With the conversation logs set to go to CloudWatch, we can review what users are asking the bot, add more examples over time as needed, or even add new functionality if we see a demand for it.

With the examples set, and a quick test in the console to confirm it works as expected, we go onto adding a slot for country.

Slot Validation

First, we need to decide what information we want. In this case, the Country. There are two options for validating slots – Amazon built-in types, or a custom type that calls a Lambda for validation. In this case, I've used the AMAZON.Country slot type to validate that the user is asking about a country.
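If you do go down the custom-validation route, the validation Lambda can ask Lex to re-prompt for a slot by returning an ElicitSlot dialog action. A minimal sketch of that response, assuming the Lex V1 response format (the prompt wording is mine, and the intent/slot values come in from the event):

```python
def elicit_country_slot(intent_name, slots):
    """Build a Lex V1 response that asks the user for the Country slot again."""
    return {
        "dialogAction": {
            "type": "ElicitSlot",
            "intentName": intent_name,
            "slots": slots,
            "slotToElicit": "Country",
            "message": {
                "contentType": "PlainText",
                "content": "Which country would you like information about?",
            },
        }
    }
```

Returning this from the validation hook keeps the conversation going rather than failing the intent outright.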

We can update our examples to show where the slot should be:

– How are things going in {Country}

– What’s the state of the Virus in {Country}

– Tell me about {Country}

I have noticed that this can be a little buggy – for example, asking “What’s the latest info on the US” results in “the US” being pulled out as a Country. We could potentially add more custom validation here, or handle things in our fulfillment Lambda.
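One lightweight mitigation is to normalise the slot value in the Lambda before looking it up. A hedged sketch – the alias table and the exact country names used by the data source are assumptions for illustration:

```python
# Map common short forms to the names the data source uses (assumed names).
COUNTRY_ALIASES = {
    "us": "United States",
    "usa": "United States",
    "uk": "United Kingdom",
}

def normalise_country(raw):
    """Strip a leading article and map common aliases before the data lookup."""
    if raw is None:
        return None
    cleaned = raw.strip().lower()
    # Drop a leading "the " so "the US" matches the same alias as "US"
    if cleaned.startswith("the "):
        cleaned = cleaned[4:]
    return COUNTRY_ALIASES.get(cleaned, raw.strip())
```

This turns "the US" into "United States" while passing already-clean values like "Japan" through untouched.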


If you just need to provide a static response – e.g. direct the user to some information on your website, or provide them with some help around what the bot can do – you can define those responses in the console.

In this case, we need to go off to a remote data source, process it, and present a subset of it to the user – so we need to use a Lambda. We can create a Lambda with AWS SAM, using the hello-world template and modifying it to work with a Lex event, and perform the processing we need to fulfill the request.
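For local testing it helps to know roughly what a Lex V1 fulfillment event looks like. The sketch below is trimmed to the fields our handler reads – the bot and intent names are placeholders:

```python
# A minimal Lex V1 fulfillment event, trimmed to the fields the handler reads.
sample_event = {
    "currentIntent": {
        "name": "GetCountryStats",          # placeholder intent name
        "slots": {"Country": "Australia"},  # the slot our handler extracts
    },
    "bot": {"name": "CovidBot"},            # placeholder bot name
    "invocationSource": "FulfillmentCodeHook",
}

country = sample_event["currentIntent"]["slots"]["Country"]
```

Feeding an event like this into the handler from a unit test lets you iterate without deploying.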


def lambda_handler(event, context):

    # Extract the Country slot from the event
    country = event['currentIntent']['slots']['Country']
    print("Getting data for {0}".format(country))

    # Go off and get the data
    data = get_country_data()

    if data is None:
        # It's important to make sure your error messages are clear to the user.
        response_content = "Sorry, I am having trouble getting data at the moment. Please try again later"
    else:
        country_data = data.get(country)

        if country_data is None:
            # This lets a user see if the country they've provided has been interpreted incorrectly by Lex,
            # or if we just don't have the data for that country.
            # It would also be an option to do some further processing and see if we can figure it out.
            response_content = "Sorry, I am not able to find any information on '{0}'".format(country)
        else:
            # We know the data source has the most recent data as the last object in the array, so we grab the last item.
            latest_data = country_data[-1]

            # Field names here assume the data source's schema.
            response_content = "Here's the latest data for {0} from {1}. " \
                               "There are {2:,} Confirmed cases, with {3:,} deaths, and {4:,} recovered." \
                               .format(country, latest_data['date'], latest_data['confirmed'],
                                       latest_data['deaths'], latest_data['recovered'])

    # Here we prepare the full response body to return to Lex
    response = {
        "dialogAction": {
            "type": "Close",
            "fulfillmentState": "Fulfilled",
            "message": {
                "contentType": "PlainText",
                "content": response_content
            }
        }
    }

    # And we're done!
    return response


Slack Integration

AWS offers a few channels through which you can access your bot. Slack is one of them, and the setup is described in Integrating an Amazon Lex Bot with Slack – Amazon Lex.

There seemed to be an issue following the steps in the above guide, which was resolved by the suggestion in this thread.

Once integrated, you can interact with it over Slack:

Here we can see an example of bad error handling – an unhandled exception threw an error back to the client. A further step could be to wrap the entire processing block in a try/except to always return a sensible response to the end user.
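A sketch of that defensive wrapper, assuming the same Lex V1 response shape used in the fulfillment code above (the wrapper and its fallback message are mine, not part of the original bot):

```python
def safe_handler(event, context, inner_handler):
    """Run the real handler, falling back to a friendly Close response on any error."""
    try:
        return inner_handler(event, context)
    except Exception as exc:
        # Keep the detail in CloudWatch, but don't leak it to the user.
        print("Unhandled error: {0}".format(exc))
        return {
            "dialogAction": {
                "type": "Close",
                "fulfillmentState": "Failed",
                "message": {
                    "contentType": "PlainText",
                    "content": "Sorry, something went wrong on my end. Please try again later.",
                },
            }
        }
```

With this in place, even an unexpected exception results in a readable message in the chat client rather than a raw error.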

For more information on chatbot integration with Chime and Slack, read Steve Mactaggart's recent post here.

Website Integration

As I mentioned in the introduction, I used this AWS guide to set up a web interface. The only changes I made to the default parameters were to point it at my existing bot, and to keep the bot if the stack is deleted. You could use this repository as a base for building your own CI/CD, as it supports deploying Lex via a schema.

End Product

We now have a bot that can interact with users over multiple channels. While my work so far has been with text-based channels, Lex also supports voice out of the box and can be plugged into Amazon Connect for dealing with incoming calls.

Do you need something a bit more intelligent than an Interactive Voice Response (IVR) to help route your incoming calls? Lex might be able to help.
