mistralai/mistral-inference

Official inference library for Mistral models

Quality score: 56 / 100 (Established)

mistral-inference is a library for developers who want to run Mistral's large language models (LLMs) on their own hardware. It loads pre-trained Mistral model weights and generates text or code from text prompts. Its typical users are machine learning engineers and AI practitioners integrating Mistral models into their applications or research.


Use this if you are an AI developer looking to locally deploy and experiment with Mistral's range of open-weight large language models for various text generation or coding tasks.

Not ideal if you are an end-user without programming knowledge or if you prefer a ready-to-use application rather than deploying models yourself.
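As a rough illustration of what local deployment involves, here is a sketch adapted from the project's documented usage for the v0.3 instruct models. The model path is a placeholder: the snippet assumes the weights and `tokenizer.model.v3` file have already been downloaded separately (several gigabytes; a GPU is strongly recommended), so it is not runnable as-is.

```python
# Sketch of local text generation with mistral-inference.
# Assumes model weights were downloaded beforehand to mistral_models_path.
from mistral_inference.transformer import Transformer
from mistral_inference.generate import generate

from mistral_common.tokens.tokenizers.mistral import MistralTokenizer
from mistral_common.protocol.instruct.messages import UserMessage
from mistral_common.protocol.instruct.request import ChatCompletionRequest

mistral_models_path = "path/to/mistral-7B-Instruct-v0.3"  # placeholder path

# Load the tokenizer and the model from the downloaded folder.
tokenizer = MistralTokenizer.from_file(f"{mistral_models_path}/tokenizer.model.v3")
model = Transformer.from_folder(mistral_models_path)

# Encode a chat-style request into token ids.
request = ChatCompletionRequest(messages=[UserMessage(content="Write a haiku about GPUs.")])
tokens = tokenizer.encode_chat_completion(request).tokens

# Greedy decoding (temperature 0.0) for up to 64 new tokens.
out_tokens, _ = generate(
    [tokens],
    model,
    max_tokens=64,
    temperature=0.0,
    eos_id=tokenizer.instruct_tokenizer.tokenizer.eos_id,
)
print(tokenizer.decode(out_tokens[0]))
```

The library also ships CLI entry points (`mistral-chat`, `mistral-demo`) for experimenting without writing any code.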

Tags: large-language-models, code-generation, text-generation, local-deployment, machine-learning-engineering
No package published · No dependents
Maintenance: 10 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 20 / 25


Stars: 10,705
Forks: 1,024
Language: Jupyter Notebook
License: Apache-2.0
Category: mistral-ai-tools
Last pushed: Feb 26, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/mistralai/mistral-inference"

Open to everyone: 100 requests/day with no API key. A free key raises the limit to 1,000 requests/day.
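For programmatic use, the same endpoint can be wrapped in a few lines of Python. The URL shape mirrors the curl example above; nothing is assumed about the JSON field names in the response, so the sketch only builds the URL and returns the decoded payload as-is.

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(ecosystem: str, repo: str) -> str:
    """Build the quality endpoint URL, mirroring the curl example."""
    return f"{API_BASE}/{ecosystem}/{repo}"


def fetch_quality(ecosystem: str, repo: str) -> dict:
    """GET the quality record; assumes the endpoint returns a JSON object."""
    with urllib.request.urlopen(quality_url(ecosystem, repo)) as resp:
        return json.load(resp)


print(quality_url("transformers", "mistralai/mistral-inference"))
# → https://pt-edge.onrender.com/api/v1/quality/transformers/mistralai/mistral-inference
```

Note the unauthenticated rate limit of 100 requests/day when polling this from scripts.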