
Building on the Edge: Why We're Betting on Local-First AI

What's happening with Dinoki, where Osaurus fits in, and why we're building AI infrastructure that puts you at the center.

A question came up recently that I think a lot of you are wondering about: What's happening with Dinoki? Is Osaurus replacing it? What's the plan for 2026?

Fair questions. Here's the honest answer.


The Reality of Being a Solo Founder

Dinoki Labs is me. One person, building what I believe is important infrastructure for the future of AI on your devices. When you're a team of one, you can't build everything at once. You have to be strategic about where you focus.

Right now, that focus is Osaurus.

But Dinoki isn't going anywhere. Let me explain why Osaurus comes first.


The Trust Problem

When you install an AI assistant that wants access to your calendar, your messages, your contacts—you're being asked to trust that app with your digital life.

Most apps ask you to "just trust me bro."

That's not good enough.

Software that holds system-level permissions needs to be open source. You should be able to see exactly what's running on your machine, verify it yourself, or have others verify it for you. That transparency is Osaurus.

Think of it this way: Osaurus is the foundation. The runtime. The infrastructure layer that earns your trust through transparency. Dinoki is what gets built on top—an autonomous AI companion that can actually help you because you can actually trust it.

We have to build the foundation first.


What "Local-First" Actually Means

Let me be clear: local-first doesn't mean local-only.

Not everyone has an M-series Mac with 32GB of RAM. Not every task needs to run locally. Sometimes cloud models are faster, smarter, or simply more practical.

Local-first means you decide. You control when your data stays on your device and when it reaches out to the cloud. You're not locked into one provider's ecosystem. You're not paying twice—once for the app, again for the inference.

Osaurus already supports OpenAI, Anthropic, Ollama, LM Studio, and local MLX models. Use what makes sense for you. The point is that the choice is yours.
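"You decide" can be made concrete with a small routing sketch: the app resolves which OpenAI-compatible endpoint to talk to based on the user's preference and the sensitivity of the data, defaulting to local. The endpoint URLs, the local port, and the `pick_endpoint` helper are illustrative assumptions for this post, not Osaurus's documented configuration.

```python
# Sketch: local-first provider routing. Sensitive work stays on-device;
# everything else goes wherever the user chose.
# All URLs below are assumptions, not Osaurus's actual defaults.

PROVIDERS = {
    "local": "http://localhost:1337/v1",       # hypothetical local Osaurus endpoint
    "openai": "https://api.openai.com/v1",
    "anthropic": "https://api.anthropic.com/v1",
}

def pick_endpoint(preference: str, sensitive: bool) -> str:
    """Route sensitive requests to the local endpoint; otherwise honor the user's choice."""
    if sensitive:
        return PROVIDERS["local"]
    # Unknown preferences fall back to local -- local as the default.
    return PROVIDERS.get(preference, PROVIDERS["local"])

print(pick_endpoint("openai", sensitive=True))   # sensitive data never leaves the machine
print(pick_endpoint("openai", sensitive=False))  # the user's cloud pick otherwise
```

The design point is that the routing decision lives in your config, not in a vendor's backend.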


Building on the Edge

If AI is the brain, the edge is where it meets you. Your fingertips. Your screen. Your machine. It's the space between human and digital—and it's where we've chosen to build.

The industry is focused on the data center. We're focused on you.

That means building native. Not because it's easier—it's actually much harder. But native software belongs on your machine. It respects your system. It integrates with your calendar, your messages, your contacts. It feels like it was made for you, because it was.

There's a difference between software built to ship fast and software built to last. We chose the harder path because it's the right one.

Your computer is yours. Your data is yours. Your AI should be yours too—not a window into someone else's server that you're renting access to. When you use cloud models through Osaurus, you're still in control. Your choice of provider. Your data, your terms.

That's what we mean by local-first. Not local-only. Local as the default. You at the center.


The Road Ahead

Dinoki has an active roadmap for 2026. It's not abandoned—it's waiting for its foundation to mature.

Osaurus is moving fast. Personas just shipped in 0.6.0. The plugin system is growing. MCP support means it already works with tools like Cursor and Claude Desktop.
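For context on what MCP support looks like in practice: clients like Claude Desktop register MCP servers through a JSON config file. The entry below is a sketch of that shape; the server name and launch command are assumptions for illustration, not Osaurus's documented invocation.

```json
{
  "mcpServers": {
    "osaurus": {
      "command": "osaurus",
      "args": ["mcp"]
    }
  }
}
```

Once registered, the client can call the server's tools the same way it calls any other MCP server, which is what makes the Cursor and Claude Desktop integrations possible.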

2026 is about growth. About building the community and ecosystem that makes all of this sustainable.


Follow Along

This is an indie project, built in public. If you believe in local-first, open-source AI infrastructure, here's how to support it:

Star us on GitHub — It takes two seconds and helps more people discover the project.

Join the Discord — Where the community hangs out, shares feedback, and shapes what we build next.

Follow on X — Building in public means you'll see the wins, the struggles, and everything in between.