Inference is all you need. Everything else can be owned by you.

The AI harness for macOS. Your agents, memory, tools, and identity — portable across any model, local or cloud. Open source.

Download for Mac
--- stars · --- downloads · --- native Swift

Models are getting cheaper and more interchangeable by the day. What’s irreplaceable is the layer around them — your context, your memory, your tools, your identity. Others keep that layer on their servers. Osaurus keeps it on your machine.

Read: On Decentralized Acceleration →
What it does
Local models
Run Llama, Qwen, Gemma, Mistral, DeepSeek, and more on Apple Silicon with optimized MLX inference. Fully private, fully offline.
Provider agnostic
Connect to OpenAI, Anthropic, Gemini, xAI, Venice AI, OpenRouter, Ollama, or LM Studio. Your context persists across all of them.
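Context portability works because these providers speak (or can be adapted to) the same OpenAI-style chat-completions wire format, so switching providers is just a base-URL (and API-key) swap. A minimal sketch — the URL, port, and model id below are illustrative placeholders, not Osaurus's documented defaults:

```python
import json
import urllib.request

# Hypothetical local base URL; the real Osaurus host/port may differ.
# Pointing BASE_URL at a cloud provider's OpenAI-compatible endpoint
# (plus an Authorization header) is the entire provider switch.
BASE_URL = "http://127.0.0.1:1337/v1"

payload = {
    "model": "llama-3.2-3b-instruct",  # placeholder model id
    "messages": [{"role": "user", "content": "What did we discuss yesterday?"}],
}

req = urllib.request.Request(
    BASE_URL + "/chat/completions",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)

# Uncomment to send against a running server:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the request body never changes, the harness can keep memory and context on your machine and inject them into any backend.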
Memory
4-layer memory with a knowledge graph. Learns from every conversation automatically — extracts facts, detects contradictions, and recalls relevant context. All stored locally.
Agents
Each agent gets its own prompts, tools, memory, and visual theme — a research assistant, a coding partner, a file organizer, whatever you need.
Work Mode
Give an agent an objective. It breaks the work into trackable issues and executes them step by step: parallel tasks, file operations, background processing.
Sandbox
Agents execute code in an isolated Linux VM powered by Apple Containerization. Full dev environment — shell, Python, Node.js, compilers — with zero risk to your Mac.
Automation
Schedule recurring AI tasks and watch folders for changes. Your AI works in the background so you don't have to.
Skills
Import reusable AI capabilities from GitHub repos or files. Compatible with Agent Skills.
MCP
Both server and client. Expose tools to Cursor and Claude Desktop, or connect to remote MCP servers and aggregate their tools.
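On the client side, MCP host apps like Claude Desktop declare servers in an `mcpServers` map in their config file. The entry name and command below are hypothetical placeholders, not Osaurus's documented launch command — check the docs for the actual entry point:

```json
{
  "mcpServers": {
    "osaurus": {
      "command": "/path/to/osaurus-mcp",
      "args": []
    }
  }
}
```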
Identity
Cryptographic identity for every participant. Agents, humans, and devices each get a secp256k1 address — all actions signed and verifiable without a central authority.
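The signing primitive behind this can be sketched in plain Python. This illustrates secp256k1 ECDSA in general — not Osaurus's actual address derivation or signature encoding:

```python
import hashlib

# secp256k1 parameters (curve y^2 = x^3 + 7 over F_p)
P = 2**256 - 2**32 - 977  # field prime
N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141  # group order
G = (0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798,
     0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8)

def point_add(a, b):
    """Affine point addition; None represents the point at infinity."""
    if a is None:
        return b
    if b is None:
        return a
    if a[0] == b[0] and (a[1] + b[1]) % P == 0:
        return None
    if a == b:
        lam = 3 * a[0] * a[0] * pow(2 * a[1], -1, P) % P
    else:
        lam = (b[1] - a[1]) * pow(b[0] - a[0], -1, P) % P
    x = (lam * lam - a[0] - b[0]) % P
    return x, (lam * (a[0] - x) - a[1]) % P

def scalar_mult(k, point=G):
    """Double-and-add scalar multiplication."""
    result = None
    while k:
        if k & 1:
            result = point_add(result, point)
        point = point_add(point, point)
        k >>= 1
    return result

def sign(priv, z, k):
    """ECDSA over digest z; k must be unique per signature (RFC 6979 in practice)."""
    r = scalar_mult(k)[0] % N
    s = pow(k, -1, N) * (z + r * priv) % N
    return r, s

def verify(pub, z, sig):
    """Anyone holding the public key can check the signature -- no central authority."""
    r, s = sig
    w = pow(s, -1, N)
    x = point_add(scalar_mult(z * w % N), scalar_mult(r * w % N, pub))[0]
    return x % N == r

# Demo: an agent signs an action record with its private key.
priv = int.from_bytes(hashlib.sha256(b"demo agent key").digest(), "big") % N
pub = scalar_mult(priv)  # the agent's verifiable public identity
z = int.from_bytes(hashlib.sha256(b"action: archive inbox").digest(), "big") % N
k = int.from_bytes(hashlib.sha256(b"demo nonce").digest(), "big") % N  # insecure fixed nonce, demo only
sig = sign(priv, z, k)
```

Verification needs only the public key and the signed digest, which is what makes every action auditable without a trusted server in the loop.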
Relay
Secure web tunneling to reach your local agents from anywhere. Unique URL per agent based on its crypto address. No port forwarding, no configuration.
Voice
On-device transcription via FluidAudio on Apple's Neural Engine. Voice input in chat, voice-activity-detection (VAD) mode with wake-word activation, and a global hotkey to transcribe into any app.
Tools & Plugins
20+ native plugins: Mail, Calendar, Vision, macOS Use, XLSX, PPTX, browser automation, and more. Plugins support v1 and v2 ABIs — register HTTP routes, serve web apps, and persist data.
Works with
Local / MLX · Liquid AI · Apple Foundation Models · OpenAI · Anthropic · Gemini · xAI / Grok · Venice AI · OpenRouter · Ollama · LM Studio
--- app size · --- downloads · --- GitHub stars · MIT open source

Own your AI.

Native Swift on Apple Silicon. No Electron. No compromises. Provider-agnostic. MIT licensed.

macOS 15.5+ · Apple Silicon