Mcourtyard/m-courtyard
M-Courtyard: Local AI Model Fine-tuning Assistant for Apple Silicon. Zero-code, zero-cloud, privacy-first desktop app powered by Tauri + React + mlx-lm.
This desktop tool helps you create custom AI language models on your Apple Silicon Mac without needing to write code or use cloud services. You provide unstructured documents like PDFs or Word files, and it processes them into training data, fine-tunes a chosen AI model, and lets you export a customized local AI. It's designed for professionals or individuals who want to tailor large language models using their own sensitive or specialized data, ensuring complete privacy.
Use this if you need to fine-tune an AI model with your specific, proprietary, or sensitive data on your Apple Silicon Mac, and you want a straightforward, no-code desktop application that keeps everything private and local.
Not ideal if you don't have an Apple Silicon Mac, need to train extremely large models that exceed local hardware capabilities, or require a cloud-based solution for collaborative or distributed training.
Stars: 67
Forks: 5
Language: TypeScript
License: —
Category: —
Last pushed: Mar 11, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/Mcourtyard/m-courtyard"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
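If you are calling the endpoint from code rather than curl, the request can be sketched in TypeScript. The URL path follows the curl example above; `buildQualityUrl` is a hypothetical helper, and the `X-API-Key` header name for the optional free key is an assumption, not confirmed by the API docs.

```typescript
// Base path taken from the curl example above.
const API_BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools";

// Hypothetical helper: builds the quality-data URL for a given owner/repo.
function buildQualityUrl(owner: string, repo: string): string {
  return `${API_BASE}/${encodeURIComponent(owner)}/${encodeURIComponent(repo)}`;
}

const url = buildQualityUrl("Mcourtyard", "m-courtyard");
console.log(url);

// Actual fetch left commented so the snippet has no network side effects.
// The "X-API-Key" header name is an assumption for the keyed 1,000/day tier:
// const res = await fetch(url, { headers: { "X-API-Key": "<your key>" } });
// const data = await res.json();
```

Without a key, the same URL works anonymously up to the 100 requests/day limit.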
Higher-rated alternatives
containers/ramalama
RamaLama is an open-source developer tool that simplifies the local serving of AI models from...
av/harbor
One command brings a complete pre-wired LLM stack with hundreds of services to explore.
RunanywhereAI/runanywhere-sdks
Production-ready toolkit for running AI locally
runpod-workers/worker-vllm
The RunPod worker template for serving our large language model endpoints. Powered by vLLM.
foldl/chatllm.cpp
Pure C++ implementation of several models for real-time chatting on your computer (CPU & GPU)