sudoAPWH / LLMBot-Matrix

About this project

What is LLMBot?

LLMBot is a Matrix chatbot built in Rust that brings AI-powered conversations into the Matrix ecosystem. It connects to any OpenAI-compatible API — whether that's OpenAI itself, Ollama, Gemini, or any other compatible provider — and delivers intelligent responses directly within encrypted Matrix rooms.

The key idea is that your conversations stay private. LLMBot supports full end-to-end encryption with SAS emoji verification, so the AI integration doesn't compromise the security guarantees that Matrix provides.

Core Concepts

🔒 End-to-End Encryption

Full E2EE support with SAS emoji verification. Your AI conversations are just as secure as regular Matrix messages.

🤖 Multi-API Support

Works with any OpenAI-compatible API: OpenAI, Ollama, Gemini, and more. Swap providers without changing your setup.

💬 Reply Chain Memory

Follows message reply chains to reconstruct conversation context, enabling natural multi-turn conversations.

🏠 Per-Room Customization

Each room can have its own system prompt, temperature, and model settings. One bot, many personalities.

👥 Access Control

Three-tier access: admin-only commands, allowed users/rooms/servers, and restricted user lists for fine-grained control.

📩 Mention-Only Mode

The bot only responds when explicitly mentioned, keeping rooms clean and conversations intentional.
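The three-tier access model above can be sketched in plain Rust. This is an illustrative stand-in, not the bot's actual API: the `AccessConfig` struct, its field names, and the `may_chat`/`may_admin` helpers are all hypothetical, and room-level checks are omitted for brevity.

```rust
use std::collections::HashSet;

// Hypothetical sketch of the three access tiers described above.
struct AccessConfig {
    admins: HashSet<String>,           // may run admin-only commands
    allowed_users: HashSet<String>,    // explicit user allow-list
    allowed_servers: HashSet<String>,  // e.g. "example.org"
    restricted_users: HashSet<String>, // always denied
}

impl AccessConfig {
    /// A user may chat if they are not restricted and either appear on
    /// the user allow-list or come from an allowed homeserver.
    fn may_chat(&self, user_id: &str) -> bool {
        if self.restricted_users.contains(user_id) {
            return false;
        }
        // Matrix user IDs look like "@alice:example.org"; the part
        // after the last ':' is the homeserver name.
        let server = user_id.rsplit(':').next().unwrap_or("");
        self.allowed_users.contains(user_id) || self.allowed_servers.contains(server)
    }

    /// Admin commands require both admin membership and chat access.
    fn may_admin(&self, user_id: &str) -> bool {
        self.admins.contains(user_id) && self.may_chat(user_id)
    }
}
```

Layering the checks this way means a restricted user is denied even if their homeserver is on the allow-list, which matches the "fine-grained control" goal.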

How It Works

  Matrix Room ──────> LLMBot (Rust) ──────> OpenAI-Compatible API
       │                    │                        │
   @bot mention       Decrypt + parse         Generate response
   or reply chain     Build context           Stream back
       │              from reply chain               │
       ▼                    ▼                        ▼
  Encrypted E2EE      SQLite Storage          Any LLM Provider
  transport           room configs            Ollama, OpenAI,
                      session data            Gemini, etc.

When a user mentions the bot or replies in a thread, LLMBot decrypts the message, walks the reply chain to build conversation context, sends it to the configured LLM provider, and returns the response — all within the encrypted channel. Room-specific settings and session data are persisted in a local SQLite database.
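The reply-chain walk at the heart of this flow can be sketched with plain std types. The `Event` struct below is a simplified stand-in for a decrypted Matrix message (real code would use matrix-sdk event types), assuming each event records the event ID it replies to, as Matrix's `m.in_reply_to` relation does.

```rust
use std::collections::HashMap;

// Simplified stand-in for a decrypted room message.
struct Event {
    sender: String,
    body: String,
    in_reply_to: Option<String>, // event ID of the parent message, if any
}

/// Walk the reply chain from the newest event back to the root, then
/// return (sender, body) pairs oldest-first so they can be mapped
/// directly onto an LLM chat history.
fn build_context(events: &HashMap<String, Event>, newest: &str) -> Vec<(String, String)> {
    let mut chain = Vec::new();
    let mut cursor = Some(newest.to_string());
    while let Some(id) = cursor {
        match events.get(&id) {
            Some(ev) => {
                chain.push((ev.sender.clone(), ev.body.clone()));
                cursor = ev.in_reply_to.clone();
            }
            None => break, // parent not cached locally; stop walking
        }
    }
    chain.reverse(); // chronological order: root first
    chain
}
```

Because context lives in the reply relation itself, the bot needs no separate per-conversation history store, only the events it can already see in the room.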

Design Decisions

  • Rust + Matrix SDK — Native performance and memory safety for a long-running daemon that handles cryptographic operations.
  • Reply chain context — Rather than maintaining a separate conversation history, the bot follows Matrix's native reply structure to reconstruct context naturally.
  • Provider-agnostic API — By targeting the OpenAI-compatible API spec, the bot works with a growing ecosystem of providers without code changes.
  • Per-room configuration — Different rooms can serve different purposes (coding help, general chat, roleplay) with distinct system prompts and parameters.
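The last two decisions combine naturally: per-room settings become parameters of a single OpenAI-compatible request shape. The sketch below is hypothetical — the `RoomConfig` field names are illustrative, the JSON is built by hand for self-containment (real code would use serde_json), and naive string interpolation like this does not escape quotes in the content.

```rust
// Hypothetical per-room settings, persisted per room in SQLite.
struct RoomConfig {
    system_prompt: String,
    model: String,
    temperature: f32,
}

/// Build a chat-completions request body. The JSON keys ("model",
/// "temperature", "messages", "role", "content") follow the
/// OpenAI-compatible spec, so only the base URL and model name change
/// between providers. NOTE: no escaping of quotes inside strings here.
fn request_body(cfg: &RoomConfig, user_msg: &str) -> String {
    format!(
        r#"{{"model":"{}","temperature":{},"messages":[{{"role":"system","content":"{}"}},{{"role":"user","content":"{}"}}]}}"#,
        cfg.model, cfg.temperature, cfg.system_prompt, user_msg
    )
}
```

Swapping Ollama for OpenAI then means changing the endpoint and `model` string in the room's config row, while the request-building code stays untouched.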