We’ve been working on a project at Liquidos called Odessey, and wanted to share an early research preview to get feedback from the community.
Odessey is a Rust-first agent orchestration runtime focused on local-first execution and privacy. It came out of frustration with cloud-heavy agent frameworks that are difficult to reason about, secure, or run reliably on the edge. We wanted something closer to a runtime layer for agents rather than just another workflow abstraction.
Under the hood, Odessey is built on top of our open-source Rust agent framework, AutoAgents, and explores ideas like local agent composition and orchestration, skill execution, plugins, and sandboxed runtimes. The long-term direction is toward what we loosely call an agentic operating system, though we’re still pressure-testing whether that framing actually makes sense.
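To make the composition idea a bit more concrete, here is a minimal, illustrative Rust sketch of agents that own local skills behind a simple orchestrator. The `Skill`, `Agent`, and `Orchestrator` types and the routing logic are hypothetical, chosen just to show the shape of local orchestration; they are not the AutoAgents API.

```rust
// Illustrative sketch only; hypothetical types, not the AutoAgents API.

/// A skill: one locally executable capability.
trait Skill {
    fn name(&self) -> &str;
    fn run(&self, input: &str) -> String;
}

/// An agent bundles skills and routes a task to one of them.
struct Agent {
    id: String,
    skills: Vec<Box<dyn Skill>>,
}

impl Agent {
    fn handle(&self, task: &str) -> Option<String> {
        // Naive routing: pick the first skill whose name appears in the task text.
        self.skills
            .iter()
            .find(|s| task.contains(s.name()))
            .map(|s| format!("[{}] {}", self.id, s.run(task)))
    }
}

/// A local orchestrator composes agents and dispatches tasks, with no cloud hop.
struct Orchestrator {
    agents: Vec<Agent>,
}

impl Orchestrator {
    fn dispatch(&self, task: &str) -> Option<String> {
        self.agents.iter().find_map(|a| a.handle(task))
    }
}

struct Summarize;
impl Skill for Summarize {
    fn name(&self) -> &str {
        "summarize"
    }
    fn run(&self, input: &str) -> String {
        format!("summary of: {input}")
    }
}

fn main() {
    let orchestrator = Orchestrator {
        agents: vec![Agent {
            id: "writer".into(),
            skills: vec![Box::new(Summarize)],
        }],
    };
    if let Some(out) = orchestrator.dispatch("summarize the release notes") {
        println!("{out}");
    }
}
```

The interesting parts of the actual runtime are what this sketch deliberately leaves out: sandboxing skill execution, loading plugins, and coordinating between agents.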
This demo is intentionally limited. It’s a preview of direction rather than a feature-complete product, and we’re sharing it early to learn before open-sourcing the full platform.
We’d really appreciate thoughts on:
- Whether you’d consider using something like this, and for what use cases
- What would need to exist for real-world adoption
- Whether the framing makes sense, or if it should be positioned differently
If you’re interested in agent orchestration, local-first AI tooling, Rust-based runtimes, or alternatives to cloud-centric LLM frameworks, we’d love to hear your perspective.
Open-source agent framework (Rust): https://github.com/liquidos-ai/AutoAgents
If this direction resonates, we’re also keeping a small waitlist at https://liquidos.ai to share updates as the platform evolves.
Happy to answer questions and take critiques.