aws-samples/sample-gen-ai-evaluations-workshop
This workshop teaches systematic approaches to evaluating Generative AI workloads for production use. You'll build evaluation frameworks that go beyond basic metrics to ensure reliable model performance while optimizing cost.
It helps you confirm that your Generative AI applications deliver accurate, cost-effective, and reliable results both before and after they go live, by setting up robust testing frameworks that measure your AI outputs against quality, performance, and cost benchmarks. It is intended for AI solution architects, machine learning engineers, and product managers responsible for deploying and maintaining Generative AI systems.
Use this if you are building or deploying Generative AI applications and need a systematic way to evaluate their performance, cost, and output quality.
Not ideal if you are looking for a basic introduction to Generative AI concepts rather than hands-on evaluation strategies.
Stars
27
Forks
4
Language
Jupyter Notebook
License
MIT-0
Category
Last pushed
Mar 02, 2026
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/generative-ai/aws-samples/sample-gen-ai-evaluations-workshop"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
GoogleCloudPlatform/vertex-ai-samples
Notebooks, code samples, sample apps, and other resources that demonstrate how to use, develop...
neo4j-partners/hands-on-lab-neo4j-and-google
Hands on Lab for Neo4j and Google
lynnlangit/learning-cloud
Courses, sample code, articles & screencasts - AWS, Azure, & GCP
GoogleCloudPlatform/applied-ai-engineering-samples
This repository compiles code samples and notebooks demonstrating how to use Generative AI on...
streamlit/30DaysOfAI
30 Days of AI