Hi everyone! I’ve been working on a desktop app that makes local LLM fine-tuning more accessible for Mac users.
What it does:
- Import documents → Generate training data with Ollama → Fine-tune with mlx-lm → Export to Ollama
- Runs entirely locally, no cloud/API required
- Built with Tauri + React, leverages Apple’s MLX framework
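For anyone curious what the "generate training data" step produces: mlx-lm's LoRA trainer reads a data directory containing `train.jsonl` / `valid.jsonl`, one JSON object per line. A minimal sketch, assuming the `{"messages": [...]}` chat layout (the Q/A pair and file names here are illustrative, not from the app):

```python
import json
from pathlib import Path

def write_jsonl(path, examples):
    # One JSON object per line -- the JSONL layout mlx-lm's
    # LoRA trainer expects for train.jsonl / valid.jsonl.
    with open(path, "w", encoding="utf-8") as f:
        for ex in examples:
            f.write(json.dumps(ex, ensure_ascii=False) + "\n")

def to_chat_example(question, answer):
    # A Q/A pair (as produced by the Ollama generation step)
    # converted to the chat-style "messages" layout.
    return {"messages": [
        {"role": "user", "content": question},
        {"role": "assistant", "content": answer},
    ]}

# Illustrative pair; the real app derives these from imported documents.
pairs = [("What does the app do?", "It fine-tunes local models with mlx-lm.")]

data_dir = Path("data")
data_dir.mkdir(exist_ok=True)
write_jsonl(data_dir / "train.jsonl", [to_chat_example(q, a) for q, a in pairs])
```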
Why I built it:
The mlx-lm library is powerful, but its CLI workflow can be intimidating. The app wraps the whole pipeline in a GUI while keeping mlx-lm's full flexibility.
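To give a feel for the CLI surface being wrapped, here is a sketch of how the train and fuse invocations can be assembled (flag names assumed from mlx-lm's LoRA tooling — check `python -m mlx_lm.lora --help` against your installed version; the model name and paths are placeholders):

```python
import sys

def lora_command(model, data_dir, iters=600, adapter_path="adapters"):
    # Builds the argv for mlx-lm's LoRA training entry point.
    # Flag names are assumptions based on mlx-lm's documented CLI.
    return [
        sys.executable, "-m", "mlx_lm.lora",
        "--model", model,
        "--train",
        "--data", data_dir,
        "--iters", str(iters),
        "--adapter-path", adapter_path,
    ]

def fuse_command(model, adapter_path="adapters", save_path="fused_model"):
    # Fuses the trained LoRA adapters back into the base weights
    # so the result can be exported and served.
    return [
        sys.executable, "-m", "mlx_lm.fuse",
        "--model", model,
        "--adapter-path", adapter_path,
        "--save-path", save_path,
    ]

# The GUI would hand lists like these to subprocess.run and stream the
# output into the UI; shown here without actually executing anything.
cmd = lora_command("mlx-community/Mistral-7B-Instruct-v0.3-4bit", "data")
```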
Would love to hear feedback from the community!