Betswish/Cross-Lingual-Consistency
Easy-to-use framework for evaluating the cross-lingual consistency of factual knowledge in language models (supports LLaMA, BLOOM, mT5, RoBERTa, etc.). Paper: https://aclanthology.org/2023.emnlp-main.658/
This framework helps AI researchers and developers assess whether a multilingual language model gives consistent factual answers across languages. Given a large language model and a pair of languages, it outputs a score indicating how consistent the model's factual knowledge is between them. It is designed for anyone developing or evaluating multilingual AI systems who needs to ensure fairness and reliability across user languages.
No commits in the last 6 months.
Use this if you are a researcher or developer concerned with how consistently multilingual large language models retrieve factual knowledge across different languages.
Not ideal if you are an end-user who simply wants to use a language model rather than evaluate its cross-lingual consistency.
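The paper behind this repo scores consistency by comparing how a model ranks candidate answers to the same factual query in two languages. As a rough illustration (a simplified rank-overlap score, not the repository's exact metric or API), such a score can be computed like this:

```python
def consistency(rank_a, rank_b):
    """Average top-j overlap between two candidate rankings.

    rank_a / rank_b: the same candidate answers, ordered by the model's
    preference in language A and language B. Returns a value in [0, 1];
    1.0 means identical rankings at every cutoff.
    Note: this is an illustrative sketch, not the repo's implementation.
    """
    n = len(rank_a)
    total = 0.0
    for j in range(1, n + 1):
        # fraction of the top-j candidates shared by both rankings
        total += len(set(rank_a[:j]) & set(rank_b[:j])) / j
    return total / n
```

For example, identical rankings score 1.0, while fully reversed rankings of three candidates score 0.5 (the top-1 sets disagree, the top-2 sets half-agree, the top-3 sets coincide).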
Stars
27
Forks
1
Language
Python
License
Apache-2.0
Category
Last pushed
Aug 08, 2025
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/Betswish/Cross-Lingual-Consistency"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
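The same endpoint can be called from Python. This is a minimal sketch using only the standard library; the JSON field names in the response and the authentication header used for a key are assumptions, as the API's schema is not documented here:

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def quality_url(owner, repo):
    # Build the endpoint URL for a given GitHub owner/repo pair
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner, repo, api_key=None):
    # Fetch the quality data as a dict. The Authorization header name
    # is an assumption; check the API docs for the actual scheme.
    req = urllib.request.Request(quality_url(owner, repo))
    if api_key:
        req.add_header("Authorization", f"Bearer {api_key}")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

For example, `fetch_quality("Betswish", "Cross-Lingual-Consistency")` requests the same URL as the `curl` command above.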
Higher-rated alternatives
PaddlePaddle/PaddleNLP
Easy-to-use and powerful LLM and SLM library with awesome model zoo.
meta-llama/llama-cookbook
Welcome to the Llama Cookbook! This is your go to guide for Building with Llama: Getting started...
arcee-ai/mergekit
Tools for merging pretrained large language models.
changyeyu/LLM-RL-Visualized
100+ LLM/RL Algorithm Maps: original diagrams visualizing large-model and RL algorithm principles.
mindspore-lab/step_into_llm
MindSpore online courses: Step into LLM