Amazon has developed new technologies to enhance customer experience across multiple channels, such as natural language understanding and automatic speech recognition.
Amazon Lex bots can be used by businesses to integrate AI capabilities into their call centers, reducing handling times and streamlining tasks.
Generative AI expands the potential to improve customer experience, but security and compliance concerns have made businesses hesitant to put LLMs in front of customers.
The integration of Amazon Lex and Amazon Bedrock enables the use of large language models (LLMs) to enhance omnichannel customer experiences safely.
By using Amazon Lex for initial touchpoints and Amazon Bedrock as a secondary validation layer, LLMs can provide better intent classification, slot resolution assistance, and background noise mitigation.
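As a sketch of that secondary-validation idea, the snippet below shows how a fallback classifier might prompt an LLM on Amazon Bedrock when Amazon Lex cannot classify an utterance on its own. The intent list, model ID, and prompt wording are illustrative assumptions rather than the solution's exact implementation; the `invoke_model` call is the standard `bedrock-runtime` API in boto3.

```python
import json

# Hypothetical intent list; a real bot would derive these from its Lex definition.
INTENTS = ["BookFlight", "CancelReservation", "CheckStatus"]

def build_intent_prompt(utterance: str, intents=INTENTS) -> str:
    """Build a classification prompt asking the LLM to pick one intent or NONE."""
    return (
        "Classify the utterance into exactly one of these intents: "
        + ", ".join(intents)
        + ". Reply with the intent name only, or NONE if no intent matches.\n"
        + f"Utterance: {utterance}"
    )

def parse_intent_reply(reply: str, intents=INTENTS):
    """Map the model's free-text reply back to a known intent, else None."""
    candidate = reply.strip().split()[0] if reply.strip() else ""
    return candidate if candidate in intents else None

def classify_with_bedrock(utterance: str, model_id="anthropic.claude-v2"):
    """Invoke a Bedrock text model as a fallback classifier (assumed model ID)."""
    import boto3  # requires AWS credentials and Bedrock model access
    client = boto3.client("bedrock-runtime")
    body = json.dumps({
        "prompt": f"\n\nHuman: {build_intent_prompt(utterance)}\n\nAssistant:",
        "max_tokens_to_sample": 20,
    })
    response = client.invoke_model(modelId=model_id, body=body)
    completion = json.loads(response["body"].read())["completion"]
    return parse_intent_reply(completion)
```

Keeping the prompt-building and reply-parsing helpers pure means they can be unit tested without AWS access; only `classify_with_bedrock` needs credentials and model access.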
Businesses can integrate this AI-driven experience with contact center solutions like Amazon Connect to deliver intelligent customer experiences across channels seamlessly.
To deploy this solution, users need an AWS account and access to FMs on Amazon Bedrock.
The workflow involves messages being sent to Amazon Lex from any supported channel, with a Lambda function invoked to handle certain phases of the conversation.
Amazon Bedrock returns the identified intent or slot, or indicates to Amazon Lex that it was unable to relate the utterance to any intent or slot.
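A minimal shape for that Lambda hook, assuming the Lex V2 event format, might look like the following. When Lex has landed on its built-in `FallbackIntent`, the handler asks the LLM step (stubbed here as `bedrock_classify`, a hypothetical helper) for an intent and, if one is found, re-points the conversation at it before delegating back to Lex; this is a sketch, not the solution's published code.

```python
def bedrock_classify(utterance: str):
    """Placeholder for the Bedrock classification call; returns an intent
    name, or None when the LLM cannot relate the utterance to any intent."""
    return None  # stubbed out for illustration

def lambda_handler(event, context):
    """Lex V2 code hook: route fallback utterances through the LLM classifier."""
    session_state = event["sessionState"]
    intent = session_state["intent"]

    if intent["name"] == "FallbackIntent":  # Lex could not classify on its own
        llm_intent = bedrock_classify(event.get("inputTranscript", ""))
        if llm_intent:
            # Re-point the conversation at the intent the LLM identified.
            intent["name"] = llm_intent
            intent["state"] = "InProgress"

    return {
        "sessionState": {
            "dialogAction": {"type": "Delegate"},  # let Lex continue the flow
            "intent": intent,
        }
    }
```

Returning a `Delegate` dialog action hands control back to Amazon Lex, which then continues slot elicitation or fulfillment for whichever intent is now in the session state.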
This solution requires specific foundation models (FMs) to be selected in an AWS CloudFormation template; Amazon Bedrock uses the chosen models to identify the intent, identify the slot, or determine whether the transcribed message contains background noise.
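One common way a CloudFormation template passes such model choices to a Lambda function is through environment variables. The variable names and default model ID below are assumptions for illustration, not the template's actual parameters.

```python
import os

# Hypothetical environment variable names; a CloudFormation template would
# set these on the Lambda function from its FM selection parameters.
INTENT_MODEL_ID = os.environ.get("INTENT_MODEL_ID", "anthropic.claude-v2")
SLOT_MODEL_ID = os.environ.get("SLOT_MODEL_ID", INTENT_MODEL_ID)
NOISE_MODEL_ID = os.environ.get("NOISE_MODEL_ID", INTENT_MODEL_ID)

def model_for(task: str) -> str:
    """Pick the configured Bedrock model ID for an intent, slot, or noise task."""
    return {
        "intent": INTENT_MODEL_ID,
        "slot": SLOT_MODEL_ID,
        "noise": NOISE_MODEL_ID,
    }[task]
```

Keeping the three tasks configurable independently lets each one use whichever FM performs best for it, while falling back to a single shared model when only one is specified.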