Osaurus
Now in Public Beta

The Local-First AI Runtime for Apple Silicon

Lightning-fast inference powered by Swift. Complete privacy with zero cloud dependency. Built for the next generation of AI applications.

First Token Latency: 80ms
Runtime Footprint: 7MB
Offline Ready: 100%
Open License: MIT

Why Osaurus?

Built for the future of private, local AI inference

Blazing Fast

Swift-powered runtime tuned for Apple Silicon M-series chips, delivering first-token latency of roughly 80ms.

Privacy First

Your data never leaves your device. All inference happens locally, ensuring complete privacy and data sovereignty.

Simple by Design

One-step installation, OpenAI-compatible API, and developer-friendly SDK make integration seamless.
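
To make the OpenAI-compatible API concrete, here is a minimal Swift sketch that posts a chat completion request to a locally running Osaurus server. The port (8080) and model name (llama-3.2-3b) are illustrative assumptions, not documented defaults; check your local configuration for the actual values.

import Foundation

// Hypothetical local endpoint; Osaurus exposes an OpenAI-compatible API,
// but the port used here is an assumption, not a documented default.
let url = URL(string: "http://127.0.0.1:8080/v1/chat/completions")!
var request = URLRequest(url: url)
request.httpMethod = "POST"
request.setValue("application/json", forHTTPHeaderField: "Content-Type")

// Standard OpenAI chat-completions payload; the model name is a placeholder.
let payload: [String: Any] = [
    "model": "llama-3.2-3b",
    "messages": [["role": "user", "content": "Hello from a fully local runtime!"]]
]
request.httpBody = try! JSONSerialization.data(withJSONObject: payload)

// The response body follows the familiar OpenAI JSON schema.
let task = URLSession.shared.dataTask(with: request) { data, _, error in
    if let data = data, let text = String(data: data, encoding: .utf8) {
        print(text)
    } else if let error = error {
        print("Request failed: \(error)")
    }
}
task.resume()
RunLoop.main.run(until: Date().addingTimeInterval(10)) // keep a script alive for the reply

Because the endpoint speaks the same protocol as the OpenAI API, existing OpenAI client libraries can typically be pointed at the local base URL instead of a cloud host.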

Open Source

MIT licensed and fully extensible. Join our community to shape the future of local AI.

Lightweight

A runtime footprint of just 7MB. Efficient resource usage without compromising performance.

Works Offline

No internet connection required. Run AI models anywhere, anytime, completely offline.

Join the Local AI Revolution

Be part of the movement that's bringing AI back to your device. Privacy-first, blazing fast, and built for the future.

Downloads: ...
GitHub Stars: ...
Open Source: 100%