aws-samples/sample-genai-reflection-for-bedrock

✨Amazon Bedrock wrapper for inference-time LLM techniques

Score: 41 / 100 (Emerging)

This tool helps developers improve the performance of Large Language Model (LLM) applications built on Amazon Bedrock. It wraps your existing Bedrock prompts and can produce more accurate or cost-effective responses by letting models refine their own answers, consult external verification systems, or collaborate with other models. Developers and MLOps engineers working with Bedrock will find it useful for optimizing their LLM deployments.

Use this if you are an MLOps engineer or developer looking to enhance the accuracy, reduce the cost, or lower the latency of your LLM solutions on Amazon Bedrock by implementing advanced inference techniques.

Not ideal if you are looking for a no-code solution or if your LLM applications are not built on Amazon Bedrock.
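The "refine their own answers" technique described above is commonly called reflection: the model drafts an answer, critiques it, then revises. A minimal sketch of that loop follows; the function and parameter names are illustrative, not this repo's actual API, and the Bedrock call is abstracted behind a `call_model` callable so the loop itself can be tested offline.

```python
def reflect(call_model, prompt, rounds=1):
    """Return an answer refined through `rounds` of self-critique.

    `call_model` takes a prompt string and returns the model's reply;
    this keeps the loop independent of any particular LLM backend.
    """
    answer = call_model(prompt)
    for _ in range(rounds):
        # Ask the model to critique its own draft.
        critique = call_model(
            f"Question: {prompt}\nDraft answer: {answer}\n"
            "List any factual or reasoning errors in the draft."
        )
        # Ask the model to revise the draft using its critique.
        answer = call_model(
            f"Question: {prompt}\nDraft answer: {answer}\n"
            f"Critique: {critique}\nWrite an improved final answer."
        )
    return answer
```

Wiring this to Bedrock would mean implementing `call_model` with boto3's `bedrock-runtime` Converse API (assumption: the model ID shown is one you have access to), e.g. `client.converse(modelId="anthropic.claude-3-haiku-20240307-v1:0", messages=[{"role": "user", "content": [{"text": text}]}])` and reading `resp["output"]["message"]["content"][0]["text"]`.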

LLM deployment · model optimization · Generative AI · MLOps · Bedrock applications
No package published · No dependents
Maintenance 10 / 25
Adoption 5 / 25
Maturity 15 / 25
Community 11 / 25


Stars: 12
Forks: 2
Language: Python
License: MIT-0
Last pushed: Jan 15, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/generative-ai/aws-samples/sample-genai-reflection-for-bedrock"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.