richzhang/PerceptualSimilarity
LPIPS metric. pip install lpips
This project measures how perceptually similar two images or image patches are, going beyond simple pixel-by-pixel comparisons. Given two images, it outputs a score reflecting their similarity as judged by humans. It is useful for anyone working in image quality assessment, image generation, or image processing who needs a reliable way to quantify visual differences.
4,185 stars. No commits in the last 6 months.
Use this if you need to objectively quantify how visually similar two images or image modifications are, aligning with human perception.
Not ideal if you only need basic, mathematically exact pixel differences like Euclidean distance, or if your application does not involve visual perception.
Stars: 4,185
Forks: 523
Language: Python
License: BSD-2-Clause
Category: ml-frameworks
Last pushed: Jul 02, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/richzhang/PerceptualSimilarity"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
Jittor/jittor
Jittor is a high-performance deep learning framework based on JIT compiling and meta-operators.
zhanghang1989/ResNeSt
ResNeSt: Split-Attention Networks
berniwal/swin-transformer-pytorch
Implementation of the Swin Transformer in PyTorch.
NVlabs/FasterViT
[ICLR 2024] Official PyTorch implementation of FasterViT: Fast Vision Transformers with...
ViTAE-Transformer/ViTPose
The official repo for [NeurIPS'22] "ViTPose: Simple Vision Transformer Baselines for Human Pose...