aws-samples/rag-with-amazon-bedrock-and-documentdb
Question Answering Generative AI application with Large Language Models (LLMs), Amazon Bedrock, and Amazon DocumentDB (with MongoDB Compatibility)
This project helps you build a custom question-answering system for your business. You provide it with your company's documents and data, and it allows users to ask questions in plain language, receiving accurate answers drawn directly from your knowledge base. This is ideal for anyone who needs to quickly find specific information within a large collection of internal documents, such as customer support teams, research departments, or HR.
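The retrieval-augmented flow the description outlines (embed the question, retrieve matching passages from DocumentDB, then ask an LLM on Bedrock) can be sketched as below. This is a minimal illustration, not the repo's actual implementation: the model IDs, the `embedding`/`text` field names, and the collection layout are assumptions, and the clients are passed in so the helpers stay testable without AWS credentials.

```python
"""Minimal RAG sketch: Bedrock for embeddings + generation, DocumentDB
for vector retrieval. Model IDs and field names are illustrative."""
import json


def build_prompt(question: str, passages: list[str]) -> str:
    """Pure helper: combine retrieved passages and the user question."""
    context = "\n\n".join(passages)
    return (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    )


def answer(question: str, collection, bedrock_runtime, k: int = 3) -> str:
    """End-to-end RAG call; `collection` is a pymongo collection with a
    vector index on `embedding`, `bedrock_runtime` a boto3 client."""
    # 1. Embed the question (embedding model ID is an assumption).
    emb = bedrock_runtime.invoke_model(
        modelId="amazon.titan-embed-text-v1",
        body=json.dumps({"inputText": question}),
    )
    vector = json.loads(emb["body"].read())["embedding"]

    # 2. Vector similarity search in DocumentDB.
    docs = collection.aggregate([
        {"$search": {"vectorSearch": {
            "vector": vector, "path": "embedding",
            "similarity": "cosine", "k": k,
        }}},
    ])
    passages = [d["text"] for d in docs]

    # 3. Generate an answer grounded in the retrieved passages.
    resp = bedrock_runtime.invoke_model(
        modelId="anthropic.claude-3-sonnet-20240229-v1:0",
        body=json.dumps({
            "anthropic_version": "bedrock-2023-05-31",
            "max_tokens": 512,
            "messages": [{"role": "user",
                          "content": build_prompt(question, passages)}],
        }),
    )
    return json.loads(resp["body"].read())["content"][0]["text"]
```

Keeping the prompt assembly in a pure function makes the grounding step easy to unit-test independently of any AWS service.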
No commits in the last 6 months.
Use this if you need to create a secure, intelligent assistant that can answer questions based on your organization's specific data and documents.
Not ideal if you're looking for a general-purpose chatbot that doesn't rely on a custom, internal knowledge base.
Stars: 10
Forks: 1
Language: Jupyter Notebook
License: MIT-0
Category:
Last pushed: Dec 03, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/rag/aws-samples/rag-with-amazon-bedrock-and-documentdb"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
aws-samples/generative-ai-use-cases
Application implementation with business use cases for safely utilizing generative AI in...
aws-samples/serverless-rag-demo
Amazon Bedrock Foundation models with Amazon Opensearch Serverless as a Vector DB
aws-samples/amazon-bedrock-rag
Fully managed RAG solution implemented using Knowledge Bases for Amazon Bedrock
IBM/granite-workshop
Source code for the IBM Granite AI Model Workshop
aws-samples/rag-with-amazon-bedrock-and-opensearch
Opinionated sample on how to build and deploy a RAG application with Amazon Bedrock and OpenSearch