We are pleased to offer a preview of the QnAIntent in Amazon Lex. The QnAIntent lets developers securely connect foundation models (FMs) to company data for Retrieval Augmented Generation (RAG). With access to company data, FMs generate more relevant, accurate, and contextual responses. The QnAIntent can be used with new or existing Lex bots to automate frequently asked questions (FAQs) over text and voice channels, such as Amazon Connect.
The QnAIntent helps bot developers automate answers to customer questions and avoid unnecessary transfers to human representatives. Developers no longer need to predict and handle a wide range of FAQs by creating many variations of intents, sample utterances, slots, and prompts. By simply connecting the QnAIntent to company knowledge sources, a bot can immediately handle questions on the allowed content, such as “What documents do I need to submit for an accident claim?” The QnAIntent supports Knowledge Bases for Amazon Bedrock, Amazon OpenSearch Service, and Amazon Kendra as knowledge sources. Developers can also choose between a generative response summary and an exact response match, giving them control over the bot's response content. QnAIntent is available in preview in English in the US East (N. Virginia) and US West (Oregon) AWS Regions. To learn more, visit the Amazon Lex documentation page.
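As a rough illustration of what connecting the QnAIntent to a knowledge source might look like with the AWS SDK for Python (boto3), the sketch below assembles a CreateIntent-style request for the Lex Models V2 API. This is a hedged sketch, not a verified API reference: the configuration keys (`qnAIntentConfiguration`, `dataSourceConfiguration`, `bedrockKnowledgeStoreConfiguration`, `exactResponse`) and the helper function itself are illustrative assumptions based on the feature description above; consult the Amazon Lex documentation for the actual request shape.

```python
import json


def build_qna_intent_request(bot_id, bot_version, locale_id,
                             knowledge_base_arn, exact_response=False):
    """Assemble a CreateIntent-style request that attaches the built-in
    QnAIntent to a bot locale and points it at a Knowledge Base for
    Amazon Bedrock.

    Hypothetical helper: the configuration field names below are
    assumptions for illustration, not the verified Lex API schema.
    """
    return {
        "botId": bot_id,
        "botVersion": bot_version,
        "localeId": locale_id,
        "intentName": "CompanyFAQ",
        # Built-in intent signature for the new QnAIntent.
        "parentIntentSignature": "AMAZON.QnAIntent",
        "qnAIntentConfiguration": {
            "dataSourceConfiguration": {
                "bedrockKnowledgeStoreConfiguration": {
                    "bedrockKnowledgeBaseArn": knowledge_base_arn,
                    # False -> generative response summary;
                    # True -> exact response match.
                    "exactResponse": exact_response,
                }
            }
        },
    }


request = build_qna_intent_request(
    bot_id="BOTID12345",
    bot_version="DRAFT",
    locale_id="en_US",
    knowledge_base_arn="arn:aws:bedrock:us-east-1:111122223333:"
                       "knowledge-base/KBID12345",
)
print(json.dumps(request, indent=2))
```

In a real deployment, a request like this would be passed to `boto3.client("lexv2-models").create_intent(**request)`; the Kendra and OpenSearch Service data sources would use their own configuration blocks in place of the Bedrock one.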