Chunjiang-Intelligence/Credal-Transformer
Official implementation of the paper "Credal Transformer: A Principled Approach for Quantifying and Mitigating Hallucinations in Large Language Models".
Stars: 12
Forks: 2
Language: Python
License: GPL-3.0
Category:
Last pushed: Nov 08, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/Chunjiang-Intelligence/Credal-Transformer"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000 requests/day.
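For scripted access, the curl command above can be reproduced in Python. This is a minimal sketch: the URL pattern comes from the example above, but the response schema is an assumption and is not documented here.

```python
import json
import urllib.request

# Base endpoint taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def quality_url(owner: str, repo: str) -> str:
    """Build the quality-data endpoint URL for a given repository."""
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch quality data as JSON.

    The returned fields (stars, forks, etc.) are an assumption;
    inspect the actual response to see what the API provides.
    """
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Prints the same URL used in the curl example.
    print(quality_url("Chunjiang-Intelligence", "Credal-Transformer"))
```

Unauthenticated calls count against the 100 requests/day limit, so cache responses rather than polling.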
Higher-rated alternatives
bahree/helloLondon
Historical Language Model for London - A specialized LLM trained on 1500-1850 historical English text
MihneaTeodorStoica/mono-lm
Character-level language model focused on training, architecture, and optimization.
imreallyexited/Independent-LLM-Project
PyTorch framework for building and pre-training LLM's.
gurpejsingh13/punjabi-gpt-scratch-20m
Developed and pre-trained a 20.39M-parameter Punjabi GPT-style base model from scratch,...
Konohamaru04/Tiny-LLM
Tiny GPT-style LLM built from scratch in PyTorch with tokenizer training, transformer...