Amazon Lex now supports LLMs as the primary option for natural language understanding
Amazon Lex now allows you to use Large Language Models (LLMs) as the primary option to understand customer intent across voice and chat interactions. With this capability, your voice and chat bots can better understand customer requests, handle complex utterances, maintain accuracy despite spelling errors, and extract key information from verbose inputs. When customer intent is unclear, bots can intelligently ask follow-up questions to fulfill requests accurately. For example, when a customer says “I need help with my flight,” the LLM automatically clarifies whether the customer wants to check their flight status, upgrade their flight, or change their flight.
This feature is available in all AWS commercial regions where Amazon Connect and Amazon Lex operate. To learn more, visit the [Amazon Lex documentation](https://docs.aws.amazon.com/lexv2/latest/dg/generative-intent-disambiguation.html), or explore the Amazon Connect [website](https://aws.amazon.com/connect/self-service/) to see how Amazon Connect and Amazon Lex deliver seamless end-customer self-service experiences.
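As a rough illustration, LLM-assisted understanding is configured per bot locale through the Lex V2 model-building API. The sketch below builds a `generativeAISettings`-style payload in plain Python; the structure and field names (in particular `nluImprovement`) are assumptions modeled on the generative AI settings in the linked documentation and should be verified there before use.

```python
def build_generative_ai_settings(enabled: bool = True) -> dict:
    """Return a generativeAISettings-style payload that turns on
    LLM-assisted intent understanding at runtime.

    NOTE: field names here are assumptions based on the Lex V2
    generative AI settings docs; confirm against the documentation.
    """
    return {
        "runtimeSettings": {
            "nluImprovement": {
                "enabled": enabled,
            }
        }
    }


settings = build_generative_ai_settings()

# In practice this payload would be passed to the lexv2-models client,
# along with your own bot identifiers (placeholders shown), e.g.:
#
# boto3.client("lexv2-models").update_bot_locale(
#     botId="YOUR_BOT_ID",
#     botVersion="DRAFT",
#     localeId="en_US",
#     nluIntentConfidenceThreshold=0.40,
#     generativeAISettings=settings,
# )
print(settings)
```

After updating the locale, the bot must be rebuilt for the setting to take effect, which is the standard Lex V2 workflow for locale-level changes.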