ymcui/Chinese-Mixtral
Chinese Mixtral mixture-of-experts large language models (Chinese Mixtral MoE LLMs)
This project provides pre-trained large language models specifically adapted for the Chinese language. It extends a base Mixtral model with extensive Chinese training data, enabling advanced text generation and conversational AI in Chinese. It is aimed at researchers, developers, and businesses that need robust Chinese language capabilities for applications such as chatbots, content creation, or data analysis.
610 stars. No commits in the last 6 months.
Use this if you need a powerful, long-context language model that excels at understanding and generating Chinese text.
Not ideal if your primary use case is English-only or if you require a very small model for limited computational resources.
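As a rough sketch of how a model like this is typically used, the snippet below loads the weights with Hugging Face transformers and generates Chinese text. The model ID, dtype, and generation settings are illustrative assumptions, not taken from this listing; check the repo's README for the actual published checkpoint name.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ymcui/Chinese-Mixtral"  # placeholder ID; confirm against the repo

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # MoE weights are large; half precision saves memory
    device_map="auto",          # spread layers across available GPUs
)

prompt = "请简要介绍一下混合专家模型。"  # "Briefly introduce mixture-of-experts models."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))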
Stars: 610
Forks: 41
Language: Python
License: Apache-2.0
Category:
Last pushed: Apr 30, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/ymcui/Chinese-Mixtral"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
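For scripted access, a minimal Python sketch of the same call follows. The listing documents only the URL and the rate limits, so the response payload shape and the key-passing mechanism are assumptions.

import requests

URL = "https://pt-edge.onrender.com/api/v1/quality/transformers/ymcui/Chinese-Mixtral"

# No key is needed for up to 100 requests/day. How a key is passed for the
# 1,000/day tier is not documented here; an Authorization header is assumed.
headers = {}  # e.g. {"Authorization": "Bearer <your-key>"}  (assumed scheme)

resp = requests.get(URL, headers=headers, timeout=10)
resp.raise_for_status()
print(resp.json())  # payload structure is not documented in this listing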
Higher-rated alternatives
ModelTC/LightCompress
[EMNLP 2024 & AAAI 2026] A powerful toolkit for compressing large models including LLMs, VLMs,...
p-e-w/heretic
Fully automatic censorship removal for language models
Orion-zhen/abliteration
Make abliterated models with transformers, easy and fast
YerbaPage/LongCodeZip
LongCodeZip: Compress Long Context for Code Language Models [ASE2025]
locuslab/wanda
A simple and effective LLM pruning approach.