TIGER-AI-Lab/VideoScore

Official repo for "VideoScore: Building Automatic Metrics to Simulate Fine-grained Human Feedback for Video Generation" [EMNLP 2024]

Quality score: 38 / 100 (Emerging)

This tool helps video developers and researchers automatically evaluate the quality of AI-generated videos. It takes a text prompt and the generated video as input, then outputs scores across aspects such as visual quality, temporal consistency, and alignment with the text. This allows quick, objective assessment without extensive human review.


Use this if you are developing or comparing video generation models and need an automated, reliable way to score their output against human judgment benchmarks.

Not ideal if you need a creative review or subjective artistic critique of video content, as it focuses on measurable quality metrics.

AI video generation, video quality assessment, model evaluation, generative AI, video content analysis
No package, no dependents
Maintenance: 6 / 25
Adoption: 9 / 25
Maturity: 16 / 25
Community: 7 / 25

How are scores calculated?

Stars: 113
Forks: 5
Language: Python
License: MIT
Last pushed: Dec 04, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/TIGER-AI-Lab/VideoScore"

Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.
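The endpoint above follows a simple path scheme of category, owner, and repo. A minimal Python sketch that builds such a URL and parses a response of an assumed shape (the JSON field names here are illustrative, inferred from the fields shown on this page, not a documented schema):

```python
import json
from urllib.parse import quote

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    # Build the API endpoint; each path segment is URL-escaped.
    return f"{BASE}/{quote(category)}/{quote(owner)}/{quote(repo)}"

# Hypothetical response payload, mirroring the fields on this page.
sample = json.loads("""{
  "score": 38,
  "tier": "Emerging",
  "breakdown": {"maintenance": 6, "adoption": 9, "maturity": 16, "community": 7},
  "stats": {"stars": 113, "forks": 5, "language": "Python", "license": "MIT"}
}""")

url = quality_url("ml-frameworks", "TIGER-AI-Lab", "VideoScore")
# The four per-axis scores (out of 25 each) sum to the overall 100-point score.
total = sum(sample["breakdown"].values())
```

Note that the overall score is simply the sum of the four 25-point axes (6 + 9 + 16 + 7 = 38); the response structure itself is an assumption until checked against a live request.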