AWS Machine Learning Blog

Deploy a Web UI for Your Chatbot 

December 2023: Post was updated with the introduction of the streaming capability – see the latest releases on GitHub
September 2021: Post was updated with the introduction of the Transfer to Amazon Connect live chat feature

You’ve built a very cool chatbot using Amazon Lex. You’ve tested it using the Amazon Lex console. Now you’re ready to deploy it on your website.

Although you could build your own bot user interface (UI), that seems like a lot to take on. You’d need to handle support for different devices and browsers, authentication, voice recording, and much more. You figure that someone must have done this before, and that with luck you’ll find a solution that you can re-use.

The chatbot UI

Our sample Amazon Lex Web UI, referred to as the chatbot UI, already does most of the heavy lifting associated with providing a full-featured web client for Amazon Lex chatbots. You can quickly take advantage of its features and minimize time to value for your chatbot-powered applications.

You can run it as a full-page chatbot UI:

Or embed it into a site as a chatbot widget:

The chatbot UI supports the following features:

  • Works with Lex or Lex V2 bots
  • Mobile-ready responsive UI with full screen or embeddable widget modes
  • Full support for voice and text, with the ability to seamlessly toggle between them
  • Voice features including automatic silence detection, transcriptions, audio record and replay, and the ability to interrupt Amazon Lex response playback
  • Support for response cards in both text and voice interactions
  • Ability to programmatically interact with the chatbot UI from the hosting site (see the sketch after this list)
  • Multiple deployment options
  • Web accessible via Amazon CloudFront
  • Fully integrated user login via Amazon Cognito User Pool – user token accessible by Lex bot as a session attribute. Login can be optional or required.
  • Markdown support for rich text / images / video, etc.
  • Support for clickable buttons in Lex response cards
  • Optional ‘Thumbs Up’ and ‘Thumbs Down’ buttons – sends response feedback message to Lex bot
  • Optional ‘Help’ button on title bar – sends help message to Lex bot
  • Resend any previous message
  • Transfer to Amazon Connect live chat – allows users of the Lex Web UI to request and conduct live chat conversations with a human agent using Amazon Connect. Here’s a short demo video. For more information on how to configure and use this feature, see the GitHub README.
  • New! Streaming support – allows your Lex bot to stream data back to the client from a long-running Lambda fulfillment request using an out-of-band web socket connection. Can be used to enable LLM streaming conversations back to the UI or to report progress on longer API processes. For more information on how to use the feature and for an example Lambda function using LLM streaming, see the GitHub README.
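
For example, here is what driving the embedded widget from the hosting page can look like. This is a minimal sketch, not the project's documented API: it assumes the widget is embedded as an iframe and accepts a postText-style command via window.postMessage; the actual event names, message format, and transport are defined in the GitHub README and may differ between releases.

```javascript
// Illustrative sketch: send a text utterance to the embedded chatbot UI
// from the hosting page. The iframe selector, the 'postText' event name,
// and the message shape are assumptions -- check the GitHub README for
// the interface your release actually exposes.
function sendUtteranceToBot(text) {
  const chatbotIframe = document.querySelector('#lex-web-ui-iframe iframe');
  if (!chatbotIframe || !chatbotIframe.contentWindow) {
    console.warn('Chatbot UI iframe is not loaded yet');
    return;
  }
  // Post the command to the iframe; the chatbot UI is expected to validate
  // the sender's origin before acting on it.
  chatbotIframe.contentWindow.postMessage(
    { event: 'postText', message: text },
    'https://dxxxxxxxxxxxxx.cloudfront.net' // your chatbot UI origin
  );
}

sendUtteranceToBot('Buy flowers');
```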

Deployment and integration options

You have three options for deploying and integrating the chatbot UI:

  1. Use AWS CloudFormation.
  2. Use a prebuilt distribution library.
  3. Use a prepackaged Vue component.

Deploying with AWS CloudFormation is the easiest method, so we’ll show you how. For more information on all options, see the GitHub README.

Getting started with the AWS CloudFormation deployment option

Deploy the chatbot UI to kick the tires.

  1. Choose the Launch Stack button for the Region in which you will use your chatbot:
    Northern Virginia
    Oregon
    Ireland
    Sydney
    Singapore
    London
    Tokyo
    Frankfurt
  2. Accept all the default parameters.
    This deploys a demonstration environment in your account (in the AWS Region you selected in step 1) and installs the OrderFlowersBot.
  3. After AWS CloudFormation launches the stack (the status is CREATE_COMPLETE), open the Outputs tab. Choose WebAppUrl or ParentPageUrl to experiment with the chatbot UI.
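
If you prefer to script this deployment instead of using the Launch Stack button, you can create the same stack with the AWS SDK. The following is a minimal sketch using the AWS SDK for JavaScript v3; the stack name is arbitrary, the TemplateURL is a placeholder for the per-Region template linked from the GitHub README, and the capabilities shown are typical for a stack that creates IAM roles and nested stacks.

```javascript
// Illustrative sketch: create the Lex Web UI demo stack with its default
// parameters. Replace the placeholder TemplateURL with the template for
// your Region (see the GitHub README).
const {
  CloudFormationClient,
  CreateStackCommand,
} = require('@aws-sdk/client-cloudformation');

const client = new CloudFormationClient({ region: 'us-east-1' });

async function deployLexWebUiDemo() {
  const response = await client.send(new CreateStackCommand({
    StackName: 'lex-web-ui-demo',
    TemplateURL: 'https://example-bucket.s3.amazonaws.com/lex-web-ui/master.yaml', // placeholder
    // Stacks that create IAM roles and nested stacks usually need these;
    // CloudFormation reports any additional capabilities it requires.
    Capabilities: ['CAPABILITY_IAM', 'CAPABILITY_AUTO_EXPAND'],
  }));
  console.log('Stack creation started:', response.StackId);
}

deployLexWebUiDemo().catch(console.error);
```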

The stack deploys a completely serverless architecture; charges are based on your service usage.

For more information, see the CloudFormation Stack README.

After you’ve got a sense for how the chatbot UI works, try it with your own Amazon Lex chatbot and integrate it into your website. Here’s how:

  1. Choose the Launch Stack button for the Region in which you will use your chatbot:
    Northern Virginia
    Oregon
    Ireland
    Sydney
    Singapore
    London
    Tokyo
    Frankfurt
  2. In the Lex Bot Configuration Parameters section:
    • For BotName, type your bot’s name.
    • Set EnableCognitoLogin to true to enable integrated user login.
  3. In the Web Application Parameters section, complete each of the parameters.
    Note: It’s essential that you set WebAppParentOrigin to your site’s origin (for example, https://www.example.com).
  4. After AWS CloudFormation launches the stack (the status is CREATE_COMPLETE), open the Outputs tab and find the link in the SnippetUrl output value.
  5. Browse to the SnippetUrl page, where you will see a code snippet similar to the following that you can paste into your application:
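
The snippet is generated from your stack's outputs, so always copy it from your SnippetUrl page. As an illustration only, it tends to look something like this; the CloudFront URL is a placeholder, and the ChatBotUiLoader option names shown are assumptions that can vary by release.

```html
<!-- Illustrative only: load the chatbot UI widget into your page.
     Replace the placeholder CloudFront URL with the value from your
     stack's SnippetUrl page. -->
<script src="https://dxxxxxxxxxxxxx.cloudfront.net/lex-web-ui-loader.min.js"></script>
<script>
  // Option names can differ by release; the snippet generated for your
  // stack is the source of truth.
  var loaderOpts = {
    baseUrl: 'https://dxxxxxxxxxxxxx.cloudfront.net/',
    shouldLoadMinDeps: true
  };
  var loader = new ChatBotUiLoader.IframeLoader(loaderOpts);
  loader.load().catch(function (error) { console.error(error); });
</script>
```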

For code examples and additional options for integrating the chatbot UI into your website and configuring it, see our Amazon Lex Web UI repository on GitHub.

Upgrading

You can use CloudFormation to upgrade your existing Lex Web UI stack, or to change the value of any of the stack parameters.

  1. On the CloudFormation console, select the main Lex Web UI stack.
  2. Choose Update.
  3. Choose Replace current template.
  4. Enter the Amazon S3 URL of the Lex Web UI template for your Region.
  5. Choose Next three times to display the Review page.
  6. Select the acknowledgement checkboxes, and choose Update Stack to upgrade Lex Web UI.
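
You can also script the upgrade. The sketch below uses the AWS SDK for JavaScript v3 under these assumptions: the stack name and TemplateURL are placeholders for your own values, and each existing parameter you want to keep is listed with UsePreviousValue so its current value is explicitly preserved.

```javascript
// Illustrative sketch: point an existing Lex Web UI stack at a newer
// template while keeping its current parameter values.
const {
  CloudFormationClient,
  UpdateStackCommand,
} = require('@aws-sdk/client-cloudformation');

const client = new CloudFormationClient({ region: 'us-east-1' });

async function upgradeLexWebUi() {
  const response = await client.send(new UpdateStackCommand({
    StackName: 'lex-web-ui', // your main Lex Web UI stack
    TemplateURL: 'https://example-bucket.s3.amazonaws.com/lex-web-ui/master.yaml', // placeholder
    // Listing a parameter with UsePreviousValue keeps its current value;
    // supply ParameterValue instead to change a setting during the upgrade.
    Parameters: [
      { ParameterKey: 'BotName', UsePreviousValue: true },
    ],
    Capabilities: ['CAPABILITY_IAM', 'CAPABILITY_AUTO_EXPAND'],
  }));
  console.log('Stack update started:', response.StackId);
}

upgradeLexWebUi().catch(console.error);
```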

Let us know what you think

We hope you try the sample chatbot UI. Let us know about your experience in the comments section at the end of this post.

We’d also love to hear your suggestions for improvements and features. Report bugs and submit feature requests in the project GitHub repository. Even better, contribute your enhancements as pull requests!


About the Authors

Austin Johnson is a Solutions Architect, helping to maintain the Lex Web UI open source library.

Bob Strahan is a Principal Solutions Architect in the AWS Language AI Services team.

Christopher Lott is a Senior Solutions Architect in the AWS AI Language Services team. He has 20 years of enterprise software development experience. Chris lives in Sacramento, California and enjoys gardening, aerospace, and traveling the world.

Jeremy Lin is a Cloud Support Engineer focusing on AWS serverless services. He is inspired by customer inquiries and enjoys diving deep with customers to help solve their problems.

Ting-Yao Chang is a Solutions Architect based in Taiwan. He excels at understanding customer needs and designing architectures. Recently, he has been focusing on generative AI and assisting customers in building customized solutions. In his leisure time, he enjoys playing video games, singing, and dancing.