Inference is all you need. Everything else can be owned by you.

A native Mac app for any AI provider, local or cloud. Your models, tools, memory, and context — portable and yours. Open source.

Download for Mac
--- stars · --- downloads · ~10MB native Swift

AI inference is becoming a commodity. What’s irreplaceable is your context, your memory, and your data. Osaurus keeps all of it on your machine — private, portable, and independent of any single provider.

Read: On Personal AI →
What it does
Local models
Download and run Llama, Qwen, Gemma, Mistral, and more locally on Apple Silicon with optimized MLX inference.
Provider agnostic
Connect to OpenAI, Anthropic, Gemini, xAI, or OpenRouter. Your context persists across all of them.
Agents
Custom AI assistants with unique prompts, tools, and visual themes. Each agent is tailored to a different task.
Work Mode
Autonomous multi-step task execution with issue tracking, parallel tasks, and file operations.
Automation
Schedule recurring AI tasks and watch folders for changes. Your AI works in the background so you don't have to.
Skills
Extend with community skills imported directly from GitHub repositories.
MCP
Both server and client. Expose tools to Cursor and Claude Desktop, or connect to remote MCP servers and aggregate their tools.
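As a sketch of the client side of that handoff: an MCP client such as Claude Desktop registers servers in its JSON config, and an entry for Osaurus could look roughly like the following. The command name and arguments here are illustrative assumptions, not Osaurus's documented invocation — check the project's README for the actual setup.

```json
{
  "mcpServers": {
    "osaurus": {
      "command": "osaurus",
      "args": ["mcp", "serve"]
    }
  }
}
```

Once registered, the client lists Osaurus's exposed tools alongside those from any other configured MCP servers.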
Voice
Speech-to-text with WhisperKit. Voice activity detection (VAD) for hands-free activation, plus transcription into any app.
Tools & Plugins
20+ native plugins: Mail, Calendar, Vision, macOS Use, XLSX, PPTX, browser automation, and more.
Works with
Local / MLX · OpenAI · Anthropic · Gemini · xAI / Grok · OpenRouter
~10MB app size · --- downloads · --- GitHub stars · MIT open source

Own your AI.

Native Swift. Open source. Provider-agnostic. The way personal AI should be.

macOS 15.5+ · Apple Silicon