Huonyx

by @hunix

Huonyx — Unified AI Agent Platform integrating Onyx AI backend with OpenClaw multi-channel gateway

Install with the ClawLodge CLI (requires clawlodge-cli):

npm install -g clawlodge-cli
clawlodge install hunix-huonyx
v0.1.0 · 18 files

Huonyx — Unified AI Agent Platform

Huonyx is a unified AI agent platform that integrates the Onyx AI backend with the OpenClaw multi-channel gateway. The result is a single, self-hosted system that combines enterprise-grade RAG, deep research, and code execution with a personal AI assistant that reaches you on every messaging channel you already use.

Architecture

Huonyx is composed of two complementary stacks that communicate over a shared Docker network. The OpenClaw Gateway is a TypeScript/Node.js runtime that manages multi-channel messaging, agent sessions, tool calling, MCP integration, real-time voice, and media generation. The Onyx Backend is a Python/FastAPI service that provides RAG search, deep research, 50+ data connectors, document indexing, knowledge graphs, code interpretation, and a rich Next.js web frontend.

The Onyx Bridge (src/onyx-bridge/) is the integration layer that connects the two stacks. It exposes Onyx's capabilities as callable tools within the OpenClaw agent runtime and provides HTTP proxy routes through the gateway.

| Layer | Technology | Purpose |
|---|---|---|
| OpenClaw Gateway | TypeScript, Node.js | Multi-channel messaging, agent runtime, tool calling, MCP, voice, media |
| Onyx API Server | Python, FastAPI | RAG, deep research, LLM orchestration, connectors, code interpreter |
| Onyx Web Frontend | Next.js, React | Rich chat UI, admin panel, connector management |
| Onyx Bridge | TypeScript | Integration layer connecting Gateway to Onyx API |
| PostgreSQL | SQL | Relational data, auth, chat history |
| Vespa | Vector search | Document indexing and semantic search |
| Redis | Cache | Session cache, task queues |
| Nginx | Reverse proxy | Unified HTTP entry point |

Quick Start

Clone the repository and copy the environment template:

git clone https://github.com/hunix/Huonyx.git
cd Huonyx
cp .env.template .env

Edit .env to set your LLM API keys and any other configuration, then start all services:

docker compose up -d
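As a rough illustration, a minimal `.env` might contain entries like the fragment below. The variable names here are placeholders, not the project's actual keys; the authoritative names and defaults are in `.env.template`.

```
# Illustrative placeholders only -- see .env.template for the real variable names.
OPENAI_API_KEY=sk-...        # LLM provider key
POSTGRES_PASSWORD=change-me  # database credentials shared by both stacks
WEB_DOMAIN=http://localhost  # public URL of the Onyx web UI
```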

For a lightweight deployment without vector search or model servers (chat and tools still work):

docker compose -f docker-compose.yml -f docker-compose.onyx-lite.yml up -d

Access the services at the following addresses:

| Service | URL |
|---|---|
| Onyx Web UI | http://localhost:80 |
| Onyx API | http://localhost:8080 |
| OpenClaw Gateway | http://localhost:18789 |
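To confirm the deployment came up, you can probe these ports from any script. The sketch below uses Node's global `fetch`; the `/health` path on the Onyx API is an assumption, so substitute whatever health endpoint your backend actually exposes.

```typescript
// Report whether each Huonyx service answers over HTTP.
// Ports come from the table above; /health is an assumed endpoint.
async function probe(url: string): Promise<string> {
  try {
    const res = await fetch(url);
    return describe(url, res.status);
  } catch {
    return describe(url, null); // connection refused, DNS failure, etc.
  }
}

// Pure formatting helper (null status means unreachable).
function describe(url: string, status: number | null): string {
  return status === null ? `${url}: unreachable` : `${url}: HTTP ${status}`;
}

for (const url of ["http://localhost:8080/health", "http://localhost:18789"]) {
  probe(url).then(console.log);
}
```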

Directory Structure

Huonyx/
├── src/                    # OpenClaw TypeScript source
│   ├── agents/             # Agent runtime, tool calling, sessions
│   ├── channels/           # 20+ messaging channels
│   ├── gateway/            # WebSocket/HTTP gateway server
│   ├── mcp/                # MCP bridge
│   ├── onyx-bridge/        # ★ Integration layer (Onyx ↔ OpenClaw)
│   ├── image-generation/   # Image generation
│   ├── media-generation/   # Media generation
│   ├── music-generation/   # Music generation
│   ├── video-generation/   # Video generation
│   ├── tts/                # Text-to-speech
│   ├── realtime-voice/     # Real-time voice
│   ├── terminal/           # Terminal/shell
│   └── web-search/         # Web search
├── backend/                # Onyx Python backend
│   └── onyx/               # FastAPI app (chat, tools, LLM, connectors, RAG)
├── web/                    # Onyx Next.js frontend
├── ui/                     # OpenClaw Vite admin UI
├── deployment/             # Docker Compose configs, Helm, Terraform
├── packages/               # OpenClaw shared packages
├── skills/                 # OpenClaw skills (extensible agent capabilities)
├── docker-compose.yml      # Unified orchestration
├── docker-compose.onyx-lite.yml  # Lite overlay (no vector DB)
├── Dockerfile              # OpenClaw gateway image
├── .env.template           # Configuration template
└── README.md

Messaging Channels

Huonyx inherits OpenClaw's support for 20+ messaging channels. Configure any combination of these in the gateway:

WhatsApp, Telegram, Slack, Discord, Google Chat, Signal, iMessage, BlueBubbles, IRC, Microsoft Teams, Matrix, Feishu, LINE, Mattermost, Nextcloud Talk, Nostr, Synology Chat, Tlon, Twitch, Zalo, WeChat, and WebChat.

Onyx Bridge

The Onyx Bridge (src/onyx-bridge/) provides three integration points:

The Client (client.ts) is a typed HTTP client for the Onyx API that handles chat sessions, streaming messages, document search, personas, connectors, and LLM provider management.
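A minimal sketch of what such a typed client could look like is shown below. The method name, endpoint path, and config shape are assumptions for illustration; the real definitions live in src/onyx-bridge/client.ts.

```typescript
// Hypothetical shape of a typed Onyx API client (not the actual client.ts).
interface OnyxClientConfig {
  baseUrl: string;  // e.g. http://localhost:8080
  apiKey?: string;  // optional bearer token
}

class OnyxClient {
  constructor(private cfg: OnyxClientConfig) {}

  // Join the base URL and an API path without doubling slashes.
  url(path: string): string {
    return this.cfg.baseUrl.replace(/\/+$/, "") + path;
  }

  private headers(): Record<string, string> {
    const h: Record<string, string> = { "Content-Type": "application/json" };
    if (this.cfg.apiKey) h["Authorization"] = `Bearer ${this.cfg.apiKey}`;
    return h;
  }

  // Create a chat session (the endpoint path is illustrative).
  async createChatSession(personaId: number): Promise<unknown> {
    const res = await fetch(this.url("/api/chat/create-chat-session"), {
      method: "POST",
      headers: this.headers(),
      body: JSON.stringify({ persona_id: personaId }),
    });
    return res.json();
  }
}
```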

The Tool Provider (tools.ts) exposes Onyx capabilities as tool definitions that the OpenClaw agent can invoke during conversations. This includes RAG search, deep research, document search, and connector listing.
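To make the idea concrete, a RAG-search tool exposed to the agent might look like the sketch below. The field names and JSON-Schema layout are assumptions rather than the actual tools.ts schema.

```typescript
// Sketch of an Onyx RAG-search tool definition as the agent runtime
// might consume it (field names are assumptions, not the real schema).
type ToolHandler = (args: Record<string, unknown>) => Promise<string>;

interface ToolDefinition {
  name: string;
  description: string;
  parameters: Record<string, unknown>; // JSON Schema for the arguments
  handler: ToolHandler;
}

// Wrap an arbitrary search function as a callable agent tool.
function makeOnyxSearchTool(
  search: (query: string) => Promise<string>,
): ToolDefinition {
  return {
    name: "onyx_search",
    description: "Search documents indexed by Onyx using RAG",
    parameters: {
      type: "object",
      properties: { query: { type: "string", description: "Search query" } },
      required: ["query"],
    },
    handler: (args) => search(String(args.query)),
  };
}
```

The agent sees only the name, description, and parameter schema; the handler is what the runtime invokes when the model emits a matching tool call.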

The Routes (routes.ts) registers HTTP proxy endpoints on the OpenClaw gateway that forward requests to the Onyx backend, providing a unified API surface.
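The core of such a proxy is a path rewrite from the gateway's namespace to the backend's. The sketch below assumes a `/onyx` prefix on the gateway and an internal `onyx-api` service name; neither is taken from the actual routes.ts.

```typescript
// Rewrite a gateway path to its Onyx backend URL (prefix and service
// name are assumptions for illustration).
function rewriteToOnyx(
  gatewayPath: string,
  onyxBase = "http://onyx-api:8080",
): string {
  // Strip the gateway prefix and forward the remainder to the backend.
  const suffix = gatewayPath.replace(/^\/onyx(?=\/|$)/, "");
  return onyxBase + (suffix === "" ? "/" : suffix);
}
```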

Development

The OpenClaw gateway (TypeScript) and the Onyx backend (Python) can be developed independently. For the gateway:

pnpm install
pnpm build
pnpm start

For the Onyx backend:

cd backend
pip install -e ".[dev]"
uvicorn onyx.main:app --reload --port 8080

Future: Docker Agentic Sandbox

The next phase of Huonyx development will add a Docker-based agentic sandbox and orchestration layer, enabling the agent to execute code, browse the web, and manipulate files in isolated containers — similar to Manus-style autonomous agent capabilities.

License

This project is licensed under the MIT License. See LICENSE for details.

Huonyx integrates Onyx (MIT License) and OpenClaw (MIT License).


Updated 2026-05-04 18:07:30 · Published via clawlodge-cli/0.1.8
