[Tool] M-Courtyard – GUI for local LLM fine-tuning on Apple Silicon

Hi everyone! I’ve been working on a desktop app that makes local LLM fine-tuning more accessible for Mac users.

What it does:

  • Import documents → Generate training data with Ollama → Fine-tune with mlx-lm → Export to Ollama
  • Runs entirely locally, no cloud/API required
  • Built with Tauri + React, leverages Apple’s MLX framework
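Under the hood, step two produces data in the shape step three consumes: `mlx_lm.lora` trains from a directory containing `train.jsonl` and `valid.jsonl`. Here is a minimal sketch of that format, assuming mlx-lm's plain `{"text": …}` JSONL schema (it also accepts chat-style records); the example records and paths are illustrative, not the app's actual output.

```python
import json
from pathlib import Path

# Made-up Q&A records for illustration; a real run would generate these
# from the imported documents via Ollama.
examples = [
    {"text": "Q: What does the tool do?\nA: It fine-tunes local LLMs on Apple Silicon."},
    {"text": "Q: Does it need the cloud?\nA: No, everything runs locally."},
]

data_dir = Path("data")
data_dir.mkdir(exist_ok=True)

# mlx_lm.lora expects train.jsonl and valid.jsonl inside the --data directory.
for name in ("train.jsonl", "valid.jsonl"):
    with open(data_dir / name, "w") as f:
        for record in examples:
            f.write(json.dumps(record) + "\n")

print(len(Path(data_dir / "train.jsonl").read_text().splitlines()))  # → 2
```

With the directory in place, the fine-tuning step boils down to pointing `mlx_lm.lora --train --data ./data` at it.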

Why I built it:
The mlx-lm library is powerful, but the CLI workflow can be intimidating. This app wraps the whole workflow in a GUI while keeping mlx-lm’s full flexibility.

GitHub: https://github.com/Mcourtyard/m-courtyard — M-Courtyard: Local AI Model Fine-tuning Assistant for Apple Silicon. Zero-code, zero-cloud, privacy-first desktop app powered by Tauri + React + mlx-lm.

Would love to hear feedback from the community!
