amazon-bedrock-rag and simplified-corrective-rag
These are ecosystem siblings: both demonstrate approaches to implementing RAG solutions with Amazon Bedrock, but amazon-bedrock-rag presents a fully managed solution built on Knowledge Bases, while simplified-corrective-rag showcases a more advanced "Corrective RAG" technique that also uses Agents for Amazon Bedrock.
About amazon-bedrock-rag
aws-samples/amazon-bedrock-rag
Fully managed RAG solution implemented using Knowledge Bases for Amazon Bedrock
This project helps you build a custom chatbot that can answer questions using your own private documents or website content. You provide your proprietary information, and the chatbot generates accurate answers, citing its sources from your data, instead of relying solely on generic internet knowledge. This is ideal for knowledge managers, customer support leads, or anyone needing to make internal company data or specific domain knowledge easily searchable and consumable through a conversational AI.
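The managed flow described above centers on a single API call: Knowledge Bases for Amazon Bedrock can retrieve relevant chunks from your documents and generate a cited answer in one `RetrieveAndGenerate` request. The sketch below is illustrative, not code from the repo; the helper names are hypothetical, and only the request/response shapes follow the Bedrock Agent Runtime API.

```python
# Hypothetical helpers around the bedrock-agent-runtime RetrieveAndGenerate
# API. The request/response shapes match the AWS API; the function names and
# any IDs/ARNs shown are illustrative stand-ins.

def build_retrieve_and_generate_request(query, kb_id, model_arn):
    """Build the request body for a RetrieveAndGenerate call."""
    return {
        "input": {"text": query},
        "retrieveAndGenerateConfiguration": {
            "type": "KNOWLEDGE_BASE",
            "knowledgeBaseConfiguration": {
                "knowledgeBaseId": kb_id,
                "modelArn": model_arn,
            },
        },
    }

def extract_citations(response):
    """Pull source document URIs out of a RetrieveAndGenerate response dict."""
    uris = []
    for citation in response.get("citations", []):
        for ref in citation.get("retrievedReferences", []):
            uri = ref.get("location", {}).get("s3Location", {}).get("uri")
            if uri:
                uris.append(uri)
    return uris

# With boto3 installed and AWS credentials configured, the request would be
# sent roughly as:
#   client = boto3.client("bedrock-agent-runtime")
#   response = client.retrieve_and_generate(
#       **build_retrieve_and_generate_request(query, kb_id, model_arn))
#   print(extract_citations(response))
```

Keeping the request builder and citation parser as plain functions makes the cited-answer behavior easy to exercise without an AWS account.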
About simplified-corrective-rag
aws-samples/simplified-corrective-rag
How to build a simplified Corrective RAG assistant with Amazon Bedrock using LLMs, an embeddings model, Knowledge Bases for Amazon Bedrock, and Agents for Amazon Bedrock.
This project helps developers build more reliable AI assistants by addressing a common problem where large language models (LLMs) might 'hallucinate' or provide incorrect information. It takes an existing knowledge base and a user query, and if the knowledge base doesn't have the answer, it automatically performs a web search to find accurate information. This is for AI solution architects or machine learning engineers building generative AI applications who need to ensure accuracy.
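The corrective step described above can be sketched as a small control loop: retrieve from the knowledge base, grade the retrieved chunks for relevance, and fall back to web search only when the knowledge base cannot answer. This is a minimal sketch of that pattern, not the repo's implementation; every function name and the relevance threshold are hypothetical stand-ins for the Bedrock Knowledge Base, grading, and web-search components.

```python
# Illustrative Corrective RAG control flow. The collaborators (kb_retrieve,
# grade_relevance, web_search, generate) are injected stubs standing in for
# the real Bedrock and web-search calls; the 0.5 threshold is an assumption.

def corrective_rag_answer(query, kb_retrieve, grade_relevance, web_search,
                          generate, threshold=0.5):
    """Return (answer, source), where source tells which path was taken."""
    chunks = kb_retrieve(query)  # e.g. a Knowledge Base retrieval call
    relevant = [c for c in chunks if grade_relevance(query, c) >= threshold]
    if relevant:
        # Knowledge base has good evidence: answer from it, as usual RAG would.
        return generate(query, relevant), "knowledge_base"
    # Corrective path: the knowledge base could not answer, so search the web.
    web_results = web_search(query)
    return generate(query, web_results), "web_search"
```

Injecting the retrieval, grading, and search steps as plain callables keeps the corrective routing testable in isolation from any AWS or search-engine credentials.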