The Local-First AI Runtime
for Apple Silicon
Lightning-fast inference powered by Swift. OpenAI-compatible API with zero cloud dependency. Build the next generation of AI applications with complete privacy.
Why Osaurus?
Built for the future of private, local AI inference
Blazing Fast
Swift-powered runtime optimized for Apple Silicon M-series chips, delivering unmatched inference speed.
Privacy First
Your data never leaves your device. All inference happens locally, ensuring complete privacy and data sovereignty.
Developer Ready
OpenAI-compatible API means your existing code just works. One-line install with brew, comprehensive SDK, and extensive documentation.
Open Source
MIT licensed and fully extensible. Join our community to shape the future of local AI.
Lightweight
Just 7MB runtime footprint. Efficient resource usage without compromising on performance.
Works Offline
No internet connection required. Run AI models anywhere, anytime, completely offline.
Start Building in Seconds
Compatible with the OpenAI SDK: your existing code just works
1. Install
brew install osaurus
2. Start Server
osaurus serve
3. Build
Use any OpenAI SDK (a minimal example follows)
Built with Osaurus
Discover what developers are building with the Osaurus runtime
Dinoki (Live)
Native AI desktop companion with smart chat, offline support, and 300+ AI models
Tweaks (Live)
AI-powered text enhancement for macOS that improves clipboard text with a global hotkey
Dyad (Live)
Free, local, open-source AI app builder for creating full-stack apps without coding
Enchanted (Live)
Ollama-compatible app for private AI models across macOS, iOS, and Vision Pro
Cursor (Coming Soon)
AI-powered code editor with real-time suggestions and intelligent pair-programming
Raycast (Coming Soon)
macOS productivity launcher with powerful extensions and keyboard-driven workflows
Your App Here
Built something with Osaurus? We'd love to feature it!
Submit App
Ready to Build?
Join thousands of developers building private AI applications. Get started in seconds with brew install osaurus