M-Courtyard - Native Mac app for local fine-tuning (export to LM Studio/Ollama)

Hi everyone,

I’ve been working on M-Courtyard, an open-source, fully local fine-tuning app built specifically for Apple Silicon (using mlx_lm).

I built this because I found the workflow on Mac a bit fragmented. Training is one thing, but actually using the model afterwards was often a pain.

With the new v0.4.7 update, I’ve focused on closing that loop. You can now:

  • Export to MLX: Directly outputs fused safetensors that you can drag-and-drop into LM Studio.

  • Local Inference Server: Start a local OpenAI-compatible API server with one click. It connects instantly to OpenWebUI, Chatbox, or your own code.

  • Ollama: One-click export to Ollama is still supported, and the fused model files are now kept as well.
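Because the server speaks the OpenAI chat-completions protocol, any OpenAI-compatible client can talk to it. Here's a minimal sketch using only the Python standard library — note that the port, endpoint path, and model name below are assumptions for illustration, not values documented by the project (check the app's server panel for the real ones):

```python
import json

# Build a standard OpenAI-style chat-completions request body.
# "my-finetuned-model" is a placeholder for whatever name your
# exported model is served under.
payload = {
    "model": "my-finetuned-model",
    "messages": [{"role": "user", "content": "Hello!"}],
}
body = json.dumps(payload).encode("utf-8")

# With the server running, you would POST it like this
# (commented out so the snippet runs without a live server):
# import urllib.request
# req = urllib.request.Request(
#     "http://localhost:8080/v1/chat/completions",  # assumed port/path
#     data=body,
#     headers={"Content-Type": "application/json"},
# )
# print(urllib.request.urlopen(req).read().decode())
```

The same payload works unchanged with the official `openai` client by pointing `base_url` at the local server.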

It handles the setup for you (Python env, dependencies) so you don’t need to fight with terminal configs.

Repo: https://github.com/Mcourtyard/m-courtyard

Download (DMG): Releases Page

Would love any feedback from Mac users here!


This looks really cool - I will check it out!


Thanks! We welcome your suggestions. :clap:
