linonetwo/MOSS-DockerFile

Runs Fudan's MOSS language model in Docker, serving a WebUI via Gradio.

Score: 35 / 100 (Emerging)

This project helps you run the Fudan MOSS large language model locally using Docker. You provide the MOSS model files, and it serves a web interface through Gradio so you can interact with the model directly in your browser. It is aimed at researchers, developers, and enthusiasts who want to experiment with MOSS without a complex setup.

No commits in the last 6 months.

Use this if you have a powerful computer with an NVIDIA GPU (e.g., an RTX 3090 Ti or better) and want to run the MOSS language model in a user-friendly web interface.

Not ideal if you don't have access to a high-end NVIDIA GPU with at least 14GB of VRAM, or if you prefer cloud-based LLM solutions.
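As a rough idea of the deployment workflow, the commands below sketch building the image and running it with GPU access. This is a hypothetical sketch: the image tag and the host/container model paths are assumptions, not taken from this repo; 7860 is simply Gradio's default port. Check the repo's README for the actual invocation.

```shell
# Build the image from the repo's Dockerfile (tag name is an assumption)
docker build -t moss-webui .

# Run with GPU access, mounting a local MOSS model directory into the
# container and exposing Gradio's default port (7860) on the host.
# Host/container paths here are hypothetical; adjust to the repo's docs.
docker run --gpus all \
  -v "$PWD/models/moss":/models/moss \
  -p 7860:7860 \
  moss-webui
```

The `--gpus all` flag requires the NVIDIA Container Toolkit to be installed on the host.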

large-language-models local-AI-deployment AI-experimentation natural-language-processing machine-learning-operations
Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 6 / 25
Maturity 16 / 25
Community 13 / 25


Stars: 16
Forks: 3
Language: Python
License: MIT
Last pushed: Dec 15, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/linonetwo/MOSS-DockerFile"

Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.