# 🗿 MoAI

**Master of AIs**: a multi-model AI roundtable for collaborative brainstorming.

## 💡 The Idea
Instead of asking one AI a question, ask a team of AIs.
```text
You: "How should we architect our new microservice?"

   ┌─────────┐    ┌─────────┐    ┌─────────┐
   │ Claude  │    │   GPT   │    │ Gemini  │
   └────┬────┘    └────┬────┘    └────┬────┘
        │              │              │
        ▼              ▼              ▼
   "Event         "Start         "Consider
    sourcing       simple,        the team's
    gives you      monolith       experience
    audit          first..."      level..."
    trails..."
        │              │              │
        └──────────────┼──────────────┘
                       ▼
             🤝 Discussion Rounds
                       ▼
             📋 Consensus Summary
```
Models see each other's responses, debate, agree, disagree, and build upon ideas. You get richer, more diverse insights—then export everything as clean markdown.
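The discuss-mode loop described above can be sketched in a few lines. This is an illustrative sketch, not MoAI's actual API: `run_discussion` and the model callables are hypothetical stand-ins for real model clients.

```python
from typing import Callable

def run_discussion(
    question: str,
    models: dict[str, Callable[[str], str]],
    rounds: int = 3,
) -> list[tuple[str, str]]:
    """Each round, every model sees the full transcript before answering."""
    transcript: list[tuple[str, str]] = [("user", question)]
    for _ in range(rounds):
        for name, ask in models.items():
            # Render the running transcript as plain-text context for the model.
            context = "\n".join(f"{who}: {msg}" for who, msg in transcript)
            transcript.append((name, ask(context)))
    return transcript
```

Because every reply is appended to the shared transcript, each model can agree with, contest, or extend what the others said in earlier turns.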
## ✨ Features
| Feature | Description |
|---|---|
| 🔀 Open Mode | Ask all models in parallel—get diverse initial takes |
| 💬 Discuss Mode | Sequential rounds where models respond to each other |
| 🎯 @Mentions | Direct questions to specific models: `@claude explain more` |
| 🤝 Consensus | Auto-generated summary of agreements & disagreements |
| 📁 Projects | Organize discussions by topic with persistent context |
| 📤 Export | Beautiful markdown documents of entire discussions |
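The export feature turns a discussion transcript into a markdown document. A minimal sketch, assuming the transcript is a list of `(speaker, message)` pairs; `export_markdown` is a hypothetical helper, and the real exporter may differ.

```python
def export_markdown(topic: str, transcript: list[tuple[str, str]]) -> str:
    """Render a (speaker, message) transcript as a markdown document."""
    lines = [f"# {topic}", ""]
    for speaker, message in transcript:
        # One section per turn, headed by the speaker's name.
        lines += [f"## {speaker}", "", message, ""]
    return "\n".join(lines)
```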
## 🚀 Quick Start

```bash
# Clone & setup
git clone https://git.example.com/mikkel/moai.git
cd moai
uv sync          # or: pip install -e ".[dev]"

# Configure
cp .env.example .env
# Edit .env with your tokens

# Run
python -m moai.bot.main
```
## 🤖 Telegram Commands

### Projects

| Command | Description |
|---|---|
| `/projects` | List all projects |
| `/project new "Name"` | Create a project |
| `/project select <id>` | Switch project |
| `/project models claude,gpt,gemini` | Set the project's models |

### Discussion

| Command | Description |
|---|---|
| `/open <question>` | Ask all models (parallel) |
| `/discuss [rounds]` | Start a discussion (default: 3 rounds) |
| `@claude <message>` | Direct message to Claude |
| `@gpt <message>` | Direct message to GPT |
| `/stop` | End the current discussion |

### Output

| Command | Description |
|---|---|
| `/consensus` | Generate an agreement summary |
| `/export` | Export the discussion as markdown |
## 🏗️ Architecture

```text
┌──────────────────┐     ┌──────────────────────────────────┐
│   Telegram Bot   │────▶│          Python Backend          │
│  (thin client)   │     │                                  │
└──────────────────┘     │  ┌────────────┐  ┌─────────────┐ │
                         │  │  Handlers  │  │ Orchestrator│ │
                         │  └─────┬──────┘  └──────┬──────┘ │
                         │        │                │        │
                         │        ▼                ▼        │
                         │  ┌──────────┐    ┌───────────┐   │
                         │  │  SQLite  │    │ AI Client │   │
                         │  └──────────┘    └─────┬─────┘   │
                         └────────────────────────┼─────────┘
                                                  │
                         ┌────────────────────────┼─────────┐
                         │      Requesty / OpenRouter       │
                         │  ┌─────────┬─────────┬────────┐  │
                         │  │ Claude  │   GPT   │ Gemini │  │
                         │  └─────────┴─────────┴────────┘  │
                         └──────────────────────────────────┘
```
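In open mode, the orchestrator fans one question out to every model at once. A minimal sketch, assuming each client is an async callable that queries one model through the router; `open_mode` and the client signature are assumptions for illustration:

```python
import asyncio
from typing import Awaitable, Callable

async def open_mode(
    question: str,
    clients: dict[str, Callable[[str], Awaitable[str]]],
) -> dict[str, str]:
    names = list(clients)
    # Query every model concurrently; one slow model doesn't block the rest.
    replies = await asyncio.gather(*(clients[n](question) for n in names))
    return dict(zip(names, replies))
```

`asyncio.gather` preserves argument order, so replies can be zipped back to their model names safely.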
## 📍 Roadmap

- **M1** Bot basics (`/help`, `/status`)
- **M2** Project CRUD
- **M3** Single-model Q&A
- **M4** Open mode (parallel)
- **M5** Discuss mode (sequential)
- **M6** Consensus generation
- **M7** Markdown export
- **M8** @mention support
- **Phase 2** Web UI
## 🛠️ Tech Stack
| Layer | Technology |
|---|---|
| Bot | python-telegram-bot (async) |
| Backend | Python 3.11+, FastAPI |
| Database | SQLite → PostgreSQL |
| AI Routing | Requesty / OpenRouter |
| Linting | Ruff |
| Testing | pytest |
## 📄 License

MIT © 2026

*Why ask one AI when you can ask them all?* 🗿