LLMBot is a Matrix chatbot built in Rust that brings AI-powered conversations into the Matrix ecosystem. It connects to any OpenAI-compatible API — whether that's OpenAI itself, Ollama, Gemini, or any other compatible provider — and delivers intelligent responses directly within encrypted Matrix rooms.
The key idea is that your conversations stay private. LLMBot supports full end-to-end encryption with SAS emoji verification, so the AI integration doesn't compromise the security guarantees that Matrix provides.
- **End-to-end encryption:** full E2EE support with SAS emoji verification. Your AI conversations are just as secure as regular Matrix messages.
- **Provider-agnostic:** works with any OpenAI-compatible API, including OpenAI, Ollama, and Gemini. Swap providers without changing your setup.
- **Conversation context:** follows message reply chains to reconstruct conversation history, enabling natural multi-turn conversations.
- **Per-room configuration:** each room can have its own system prompt, temperature, and model settings. One bot, many personalities.
- **Access control:** three tiers for fine-grained control: admin-only commands; allowed users, rooms, and servers; and restricted user lists.
- **Mention-gated:** the bot only responds when explicitly mentioned, keeping rooms clean and conversations intentional.
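Because every supported backend speaks the same OpenAI-style chat-completions wire format, switching providers mostly amounts to pointing the bot at a different base URL. A minimal sketch of that idea; the `Provider` type, its fields, and the URL convention here are illustrative assumptions, not LLMBot's actual configuration:

```rust
/// Hypothetical provider handle: any OpenAI-compatible backend is just a
/// base URL plus a model name.
struct Provider {
    base_url: String,
    model: String,
}

impl Provider {
    /// By convention, `base_url` already includes the version segment,
    /// e.g. "https://api.openai.com/v1" or "http://localhost:11434/v1"
    /// (Ollama), so all providers share the same endpoint path.
    fn completions_url(&self) -> String {
        format!("{}/chat/completions", self.base_url.trim_end_matches('/'))
    }

    /// Naive JSON assembly for illustration only; a real implementation
    /// would use a JSON library and escape `prompt` properly.
    fn request_body(&self, prompt: &str) -> String {
        format!(
            r#"{{"model":"{}","messages":[{{"role":"user","content":"{}"}}]}}"#,
            self.model, prompt
        )
    }
}
```

With this shape, moving from a hosted API to a local Ollama instance changes only the `base_url` and `model` values; the request path and payload stay the same.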
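The three access tiers above can be sketched as a simple allowlist check. The `AccessConfig` type and its field names are assumptions for illustration, not LLMBot's actual types:

```rust
/// Hypothetical three-tier access configuration (illustrative only).
#[derive(Default)]
struct AccessConfig {
    admins: Vec<String>,           // may run admin-only commands
    allowed_users: Vec<String>,    // explicit user allowlist
    allowed_servers: Vec<String>,  // whole homeservers, e.g. "example.org"
    restricted_users: Vec<String>, // denied even if their server is allowed
}

impl AccessConfig {
    fn is_admin(&self, user: &str) -> bool {
        self.admins.iter().any(|a| a == user)
    }

    /// A user may talk to the bot if they are an admin, explicitly
    /// allowed, or on an allowed homeserver -- unless they appear on the
    /// restricted list, which always wins.
    fn may_use(&self, user: &str) -> bool {
        if self.restricted_users.iter().any(|r| r == user) {
            return false;
        }
        // Matrix user IDs look like "@alice:example.org"; the part after
        // the last ':' is the homeserver.
        let server = user.rsplit(':').next().unwrap_or("");
        self.is_admin(user)
            || self.allowed_users.iter().any(|u| u == user)
            || self.allowed_servers.iter().any(|s| s == server)
    }
}
```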
When a user mentions the bot or replies in a thread, LLMBot decrypts the message, walks the reply chain to build conversation context, sends it to the configured LLM provider, and returns the response — all within the encrypted channel. Room-specific settings and session data are persisted in a local SQLite database.
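The reply-chain walk at the heart of this pipeline can be sketched with plain data types. The `Msg` struct and `build_context` function below are hypothetical stand-ins for what the Matrix SDK would actually provide; a real bot would read these fields from decrypted room events:

```rust
use std::collections::HashMap;

/// Minimal stand-in for a decrypted Matrix message (illustrative).
struct Msg {
    sender: String,
    body: String,
    in_reply_to: Option<String>, // event ID of the parent message, if any
}

/// Walk reply links from the newest event back to the root of the chain,
/// then return (sender, body) pairs in chronological order, ready to be
/// mapped onto an LLM chat-completions message list.
fn build_context(events: &HashMap<String, Msg>, newest: &str) -> Vec<(String, String)> {
    let mut chain = Vec::new();
    let mut cursor = Some(newest.to_string());
    while let Some(id) = cursor {
        match events.get(&id) {
            Some(msg) => {
                chain.push((msg.sender.clone(), msg.body.clone()));
                cursor = msg.in_reply_to.clone();
            }
            None => break, // parent not in our cache; stop walking
        }
    }
    chain.reverse(); // oldest first, as chat APIs expect
    chain
}
```

Walking backwards and reversing at the end keeps the function a single pass over the chain, and breaking on a missing parent degrades gracefully when older events have not been cached locally.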